The Secret Proceedings of the Kansas Supreme Court...

by Denis Boyles


Locking the courthouse door may seem like a lousy way to ensure fair justice for all, but holding secret hearings on one of the state's most controversial issues is exactly what the Kansas Supreme Court is doing.

Most of us don't trust courts that operate in the dark. Americans, observed Justice Hugo Black 60 years ago, have a "historic distrust of secret proceedings, their inherent dangers to freedom, and the universal requirement of our federal and state governments that criminal trials be public."

Here's a short list of places where secret court proceedings are not unknown:

  • North Korea
  • Iran
  • China
  • Cuba
  • Syria
  • Zimbabwe
  • Kansas

All those secretive Syrians and enigmatic North Koreans probably would beg to differ, but, to paraphrase everybody's favorite Sunflower cliché, "what's up with Kansas?" How did it hop onto that short list of kangaroo judiciaries?

Back in June 2007, Planned Parenthood of Kansas and Mid-Missouri filed charges in the Kansas Supreme Court against former Attorney General and Johnson County District Attorney Phill Kline, all part of the ongoing battle by abortion clinics to prevent government enforcement of state laws regarding late-term abortions and child molestation.

Peter Brownlie, Planned Parenthood's CEO, confirmed the filing and that's the last we've heard, because Planned Parenthood requested a secret hearing, and the Kansas Supreme Court gave them one. That meant, according to David Klepper, blogging at the Kansas City Star, "the public couldn't see what the court case involved, couldn't read the filings, couldn't sit in on what surely must have been a fascinating hearing before the Supreme Court."

It's risky business when courts invite ridicule, but at the Kansas Supreme Court, the invitation's a standing one. Because of the eccentricities of state law, none of the supreme court's justices has ever been vetted by elected representatives. As many critics, including KU law professor Stephen J. Ware, have complained, "...there's no confirmation process at all": the governor appoints them, and there they sit, sometimes dozing through cases that often seem to have already been decided by some backroom handshake.

Because Kansas has never had a conservative governor, there's not even much political diversity on the court. All the members are in general agreement on the way things ought to be in Kansas; in fact, in 2005, they even started passing legislation of their own, deciding to the penny how much the state should spend on educating kids. Most of them have, at one time or another, made clear their impatience with wing-nuts and others who disagree with them.

You'd think conservatives would be pleased with a court that has moved so far back in time that its hearings resemble the Star Chamber proceedings whose abuses helped cost Britain's Charles I both his throne and his life back in the 1600s.

But no. This afternoon, Rep. Lance Kinzer's House Judiciary Committee will hold hearings (the public's invited, of course) on HB 2825, a crowbar of a bill that would pry open courtroom doors across the state by limiting the ability of judges to conduct secret trials and hearings or to have pleadings sealed.

The Planned Parenthood v Kline case triggered Kinzer's concern, but, as he wrote in an email, the bill is "more of an open [government] issue than a pro-life issue." In a statement released yesterday, Kinzer wrote, "The public has a fundamental interest in all cases that are submitted to a court for resolution. It is an unfortunate reality today that many of the most important public policy issues facing our State are being decided by courts. As such it is more important than ever that our judicial process is open and accessible."

An open court presided over by justices who have been through a public confirmation process? There's a wild and crazy idea, one that's never been tried in Teheran or in Topeka.


Denis Boyles comments on the media and the Midwest for National Review Online and also writes the Monday, Monday column for Kansas Liberty. He's the author of Superior, Nebraska, an oddly titled book mostly about Kansas.



Environmental Effects of Increased Atmospheric Carbon Dioxide

{This article supports the observation that anthropogenic sources of carbon dioxide are not causing global warming or climate change. As can clearly be seen in the figures, the Medieval warm period shows much greater temperatures than those of the modern, post-industrialization era. Additionally, sea levels were rising well before the onset of the oil and gas age.

This is an abridged version of the original piece by Arthur Robinson, Noah Robinson and Willie Soon, a bit lengthy but well worth the read. It is reprinted by permission of the authors. The complete article is available here. - Ed. }

by Arthur Robinson, Noah E. Robinson, and Willie Soon


Figure 1 (Main photo): Average length of 169 glaciers from 1700 to 2000 (4). The principal source of melt energy is solar radiation. Variations in glacier mass and length are primarily due to temperature and precipitation (5,6). This melting trend lags the temperature increase by about 20 years, so it predates the 6-fold increase in hydrocarbon use (7) even more than shown in the figure. Hydrocarbon use could not have caused this shortening trend.

Political leaders gathered in Kyoto, Japan, in December 1997 to consider a world treaty restricting human production of "greenhouse gases," chiefly carbon dioxide (CO2). They feared that CO2 would result in "human-caused global warming" – hypothetical severe increases in Earth's temperatures, with disastrous environmental consequences. During the past 10 years, many political efforts have been made to force worldwide agreement to the Kyoto treaty.

When we reviewed this subject in 1998 (1,2), existing satellite records were short and were centered on a period of changing intermediate temperature trends. Additional experimental data have now been obtained, so better answers to the questions raised by the hypothesis of "human-caused global warming" are now available.

Figure 2: Surface temperatures in the Sargasso Sea, a two million square mile region of the Atlantic Ocean, with time resolution of 50 to 100 years and ending in 1975, as determined by isotope ratios of marine organism remains in sediment at the bottom of the sea (3). The horizontal line is the average temperature for this 3,000-year period. The Little Ice Age and Medieval Climate Optimum were naturally occurring, extended intervals of climate departures from the mean. A value of 0.25 °C, which is the change in Sargasso Sea temperature between 1975 and 2006, has been added to the 1975 data in order to provide a 2006 temperature value.

The average temperature of the Earth has varied within a range of about 3°C during the past 3,000 years. It is currently increasing as the Earth recovers from a period that is known as the Little Ice Age, as shown in Figure 2. George Washington and his army were at Valley Forge during the coldest era in 1,500 years, but even then the temperature was only about 1° Centigrade below the 3,000-year average.

During the Medieval Climate Optimum, temperatures were warm enough to allow the colonization of Greenland. These colonies were abandoned after the onset of colder temperatures. For the past 200 to 300 years, Earth temperatures have been gradually recovering (26). Sargasso Sea temperatures are now approximately equal to the average for the previous 3,000 years.

The historical record does not contain any report of "global warming" catastrophes, even though temperatures have been higher than they are now during much of the last three millennia.

Figure 3: Arctic surface air temperature compared with total solar irradiance as measured by sunspot cycle amplitude, sunspot cycle length, solar equatorial rotation rate, fraction of penumbral spots, and decay rate of the 11-year sunspot cycle (8,9). Solar irradiance correlates well with Arctic temperature, while hydrocarbon use (7) does not correlate.

The most recent part of this warming period is reflected by shortening of world glaciers, as shown in Figure 4. Glaciers regularly lengthen and shorten in delayed correlation with cooling and warming trends. Shortening lags temperature by about 20 years, so the current warming trend began in about 1800.

Figure 4: Average length of 169 glaciers from 1700 to 2000 (4). The principal source of melt energy is solar radiation. Variations in glacier mass and length are primarily due to temperature and precipitation (5,6). This melting trend lags the temperature increase by about 20 years, so it predates the 6-fold increase in hydrocarbon use (7) even more than shown in the figure. Hydrocarbon use could not have caused this shortening trend.

Surface temperatures in the United States during the past century reflect this natural warming trend and its correlation with solar activity, as shown in Figure 3. Compiled U.S. surface temperatures have increased about 0.5 °C per century, which is consistent with other historical values of 0.4 to 0.5 °C per century during the recovery from the Little Ice Age (13-17). This temperature change is slight as compared with other natural variations. Three intermediate trends are evident, including the decreasing trend used to justify fears of "global cooling" in the 1970s.

ATMOSPHERIC CARBON DIOXIDE

During the past 50 years, atmospheric CO2 has increased by 22%. The magnitude of this atmospheric increase is currently about 4 gigatons of carbon (Gt C) per year. Total human industrial CO2 production, primarily from use of coal, oil, and natural gas and the production of cement, is currently about 8 Gt C per year (7,56,57). Humans also exhale about 0.6 Gt C per year, carbon that had previously been sequestered from atmospheric CO2 by plants. Office air concentrations often exceed 1,000 ppm CO2.

Much of that CO2 increase is attributable to the 6-fold increase in human use of hydrocarbon energy. However, Figures 2, 3, and 4 show that human use of hydrocarbons has not caused the observed increases in temperature.

Between 1900 and 2000, on absolute scales of solar irradiance and degrees Kelvin, solar activity increased 0.19%, while a 0.5 °C temperature change is 0.21%. This is in good agreement with estimates that Earth's temperature would be reduced by 0.6 °C through particulate blocking of the sun by 0.2% (18).

Figure 5: U.S. surface temperature from Figure 3 as compared with total solar irradiance (19).


Between 1900 and 2006, Antarctic CO2 increased 30% per 0.1 °C temperature change (72), and world CO2 increased 30% per 0.5 °C. In addition to ocean out-gassing, CO2 from human use of hydrocarbons is a new source. Neither this new source nor the older natural CO2 sources are causing atmospheric temperature to change.

Carbon dioxide has a very short residence time in the atmosphere. Beginning with the 7 to 10-year half-time of CO2 in the atmosphere estimated by Revelle and Suess (69), there were 36 estimates of the atmospheric CO2 half-time based upon experimental measurements published between 1957 and 1992 (59). These range between 2 and 25 years, with a mean of 7.5, a median of 7.6, and an upper range average of about 10. Of the 36 values, 33 are 10 years or less.
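Half-times like these convert to a mean (e-folding) residence time by dividing by ln 2, assuming the first-order exponential decay that a half-time implies; a quick Python check of the 7.5-year mean estimate:

```python
import math

# Convert a half-time to a mean (e-folding) residence time,
# assuming simple first-order exponential decay: tau = t_half / ln(2).
t_half_years = 7.5  # mean of the 36 published half-time estimates
tau = t_half_years / math.log(2)
print(f"mean residence time: {tau:.1f} years")  # ~10.8 years
```

Even this longer e-folding figure remains on the order of a decade, not centuries.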

There is no experimental evidence to support computer model estimates (73) of a CO2 atmospheric "lifetime" of 300 years or more.


FERTILIZATION OF PLANTS BY CO2


How high will the CO2 concentration of the atmosphere ultimately rise if mankind continues to increase the use of coal, oil, and natural gas? At ultimate equilibrium with the ocean and other reservoirs there will probably be very little increase. The current rise is a non-equilibrium result of the rate of approach to equilibrium. One reservoir that would moderate the increase is especially important.

Plant life provides a large sink for CO2. Using current knowledge about the increased growth rates of plants and assuming increased CO2 release as compared to current emissions, it has been estimated that atmospheric CO2 levels may rise to about 600 ppm before leveling off. At that level, CO2 absorption by increased Earth biomass is able to absorb about 10 Gt C per year (100).

Does a catastrophic amplification of these trends with damaging climatological consequences lie ahead? There are no experimental data that suggest this. There is also no experimentally validated theoretical evidence of such an amplification.

GLOBAL WARMING HYPOTHESIS

Predictions of catastrophic global warming are based on computer climate modeling, a branch of science still in its infancy. The empirical evidence – actual measurements of Earth's temperature and climate – shows no man-made warming trend. Indeed, during four of the seven decades since 1940, when average CO2 levels steadily increased, U.S. average temperatures were actually decreasing. While CO2 levels have increased substantially, are expected to continue doing so, and humans have been responsible for part of this increase, the effect on the environment has been benign.

Not only has the global warming hypothesis failed experimental tests, it is theoretically flawed as well. It can reasonably be argued that cooling from negative physical and biological feedbacks to greenhouse gases nullifies the slight initial temperature rise (84,86).

Figure 6: Qualitative illustration of greenhouse warming. "Present GHE" is the current greenhouse effect from all atmospheric phenomena. "Radiative effect of CO2" is the added greenhouse radiative effect from doubling CO2 without consideration of other atmospheric components. "Hypothesis 1 IPCC" is the hypothetical amplification effect assumed by IPCC. "Hypothesis 2" is the hypothetical moderation effect.


When an increase in CO2 increases the radiative input to the atmosphere, how and in which direction does the atmosphere respond? Hypotheses about this response differ and are schematically shown in Figure 6. Without the water-vapor greenhouse effect, the Earth would be about 14 °C cooler (81). The radiative contribution of doubling atmospheric CO2 is minor, but this radiative greenhouse effect is treated quite differently by different climate hypotheses.

The hypotheses that the IPCC (82,83) has chosen to adopt predict that the effect of CO2 is amplified by the atmosphere, especially by water vapor, to produce a large temperature increase. Other hypotheses, shown as hypothesis 2, predict the opposite – that the atmospheric response will counteract the CO2 increase and result in insignificant changes in global temperature (81,84,85,91,92). The experimental evidence, as described above, favors hypothesis 2.

The computer climate models upon which "human-caused global warming" is based have substantial uncertainties and are markedly unreliable. This is not surprising, since the climate is a coupled, non-linear dynamical system. It is very complex. Figure 7 illustrates the difficulties by comparing the radiative CO2 greenhouse effect with correction factors and uncertainties in some of the parameters in the computer climate calculations. Other factors, too, such as the chemical and climatic influence of volcanoes, cannot now be reliably computer modeled.

Figure 7: The radiative greenhouse effect of doubling the concentration of atmospheric CO2 (right bar) as compared with four of the uncertainties in the computer climate models.

The greenhouse effect amplifies solar warming of the earth. Greenhouse gases such as H2O, CO2, and CH4 in the Earth's atmosphere, through combined convective readjustments and the radiative blanketing effect, essentially decrease the net escape of terrestrial thermal infrared radiation. Increasing CO2, therefore, effectively increases radiative energy input to the Earth's atmosphere. The path of this radiative input is complex. It is redistributed, both vertically and horizontally, by various physical processes, including advection, convection, and diffusion in the atmosphere and ocean.

The reasons for the failure of the computer climate models are subjects of scientific debate (87). For example, water vapor is the largest contributor to the overall greenhouse effect (88). It has been suggested that the climate models treat feedbacks from clouds, water vapor, and related hydrology incorrectly (85,89-92).

The 3,000-year temperature record illustrated in Figure 2 also provides a test of the computer models. The historical temperature record shows that the Earth has previously warmed far more than could be caused by CO2 itself. Since these past warming cycles have not initiated water-vapor-mediated atmospheric warming catastrophes, it is evident that weaker effects from CO2 cannot do so.

There is no indication whatever in the experimental data that an abrupt or remarkable change in any of the ordinary natural climate variables is beginning or will begin to take place.

CONCLUSIONS

There are no experimental data to support the hypothesis that increases in human hydrocarbon use or in atmospheric carbon dioxide and other greenhouse gases are causing or can be expected to cause unfavorable changes in global temperatures, weather, or landscape. There is no reason to limit human production of CO2, CH4, and other minor greenhouse gases as has been proposed (82,83,97,123).

We also need not worry about environmental calamities even if the current natural warming trend continues. The Earth has been much warmer during the past 3,000 years without catastrophic effects. Warmer weather extends growing seasons and generally improves the habitability of colder regions.

As coal, oil, and natural gas are used to feed and lift from poverty vast numbers of people across the globe, more CO2 will be released into the atmosphere. This will help to maintain and improve the health, longevity, prosperity, and productivity of all people.


References:

1. Robinson, A. B., Baliunas, S. L., Soon, W., and Robinson, Z. W. (1998) Journal of American Physicians and Surgeons 3, 171-178.

2. Soon, W., Baliunas, S. L., Robinson, A. B., and Robinson, Z. W. (1999) Climate Res. 13, 149-164.

3. Keigwin, L. D. (1996) Science 274, 1504-1508. ftp://ftp.ncdc.noaa.gov/pub/data/paleo/contributions_by_author/keigwin1996/

4. Oerlemanns, J. (2005) Science 308, 675-677.

5. Oerlemanns, J., Björnsson, H., Kuhn, M., Obleitner, F., Palsson, F., Smeets, C. J. P. P., Vugts, H. F., and De Wolde, J. (1999) Boundary-Layer Meteorology 92, 3-26.

6. Greuell, W. and Smeets, P. (2001) J. Geophysical Res. 106, 31717-31727.

7. Marland, G., Boden, T. A., and Andres, R. J. (2007) Global, Regional, and National CO2 Emissions. In Trends: A Compendium of Data on Global Change. Carbon Dioxide Information Analysis Center,Oak Ridge National Laboratory, U.S. Department of Energy, Oak Ridge, TN, USA, http://cdiac.ornl.gov/trends/emis/tre_glob.htm

8. Soon, W. (2005) Geophysical Research Letters 32, 2005GL023429.

9. Hoyt, D. V. and Schatten, K. H. (1993) J. Geophysical Res. 98, 18895-18906.

References 10-132 are available in the original publication.



Cold Facts on Global Warming: The global temperature increase caused by doubling atmospheric CO2 is bounded by an upper limit of 1.4-2.7 degrees centigrade

{The article below supports the observation that anthropogenic sources of carbon dioxide are not significantly affecting either global warming or climate change. These man-made sources of CO2 are calculated and documented in Global Warming or global fraud? This is an abridged version of the original T. J. Nelson piece, still quite lengthy but well worth the read. It is reprinted by permission of the author. The complete article is available here. - Ed. }

Photo above: Absorption of ultraviolet, visible, and infrared radiation by various gases in the atmosphere. Most of the ultraviolet light (below 0.3 microns) is absorbed by ozone (O3) and oxygen (O2). Carbon dioxide has three large absorption bands in the infrared region at about 2.7, 4.3, and 15 microns. Water has several absorption bands in the infrared, and even has some absorption well into the microwave region. {Peixoto, J.P. and Oort, A.H., Physics of Climate, Springer, 1992, p. 118.}



by T. J. Nelson

What is the contribution of anthropogenic carbon dioxide to global warming? This question has been the subject of many heated arguments, and a great deal of hysteria. In this article, we will consider a simple calculation, based on well-accepted facts, that shows that the expected global temperature increase caused by doubling atmospheric carbon dioxide levels is bounded by an upper limit of 1.4-2.7 degrees centigrade. This result contrasts with the results of the IPCC's climate models, whose projections are shown to be unrealistically high.

There is general agreement that the Earth is naturally warmed to some extent by atmospheric gases, principally water vapor, in what is often called a "greenhouse effect". The Earth absorbs enough radiation from the sun to raise its temperature by 0.5 degrees per day, but is theoretically capable of emitting sufficient long-wave radiation to cool itself by 5 times this amount. The Earth maintains its energy balance in part by absorption of the outgoing longwave radiation in the atmosphere, which causes warming.

On this basis, it has been estimated that the current level of warming is on the order of 33 degrees C. That is to say, in the absence of so-called greenhouse gases, the Earth would be 33 degrees cooler than it is today, or about 255 K (-0.4° F). Of these greenhouse gases, water is by far the most important. Although estimates of the contribution from water vapor vary widely, most sources place it between 90 and 95% of the warming effect, or about 30-31 of the 33 degrees. Carbon dioxide, although present in much lower concentrations than water, absorbs more infrared radiation than water on a per-molecule basis and contributes about 84% of the total non-water greenhouse gas equivalents, or about 4.2-8.4% of the total greenhouse gas effect.

The 33 degree increase in temperature is not caused simply by absorption of radiation but mostly by the Earth's adaptation to higher temperatures, which includes secondary effects such as increased water vapor, cloud formation, and changes in albedo or surface reflectivity caused by melting and aging of snow and ice. Accurately calculating the relative contribution of each of these components presents major difficulties.

Traditionally, greenhouse gas levels are presented as dimensionless numbers representing parts per billion (ppb) multiplied by a scaling factor (global warming potential, GWP) that allows their relative efficiency of producing global temperature increases to be compared. For carbon dioxide, this scaling factor is 1.0. The factors for methane and nitrous oxide are 21 and 310, respectively, while sulfur hexafluoride is 23,900 times more effective than carbon dioxide. The GWP from carbon dioxide is primarily due to the position of its absorption bands in the critical longwave infrared region at 2, 3, 5, and 13-17 micrometers.
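CO2-equivalent accounting with these factors is just a weighted sum: each gas's amount times its GWP. A minimal Python sketch; the GWP values are those quoted above, while the emission amounts are invented purely for illustration:

```python
# CO2-equivalent accounting: multiply each gas's amount by its
# global warming potential (GWP) and sum. Factors are from the text.
GWP = {
    "CO2": 1.0,
    "CH4": 21.0,      # methane
    "N2O": 310.0,     # nitrous oxide
    "SF6": 23900.0,   # sulfur hexafluoride
}

def co2_equivalent(amounts):
    """Sum of amount * GWP over all gases; amounts in any consistent unit."""
    return sum(amount * GWP[gas] for gas, amount in amounts.items())

# Hypothetical illustration values (not data from the article):
emissions = {"CO2": 1000.0, "CH4": 10.0, "SF6": 0.01}
print(round(co2_equivalent(emissions), 2))  # 1000 + 210 + 239 = 1449.0
```

Note how even a trace amount of SF6 contributes as much as hundreds of units of CO2, which is the whole point of the GWP convention.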

Methane, nitrous oxide, ozone, CFCs and other miscellaneous gases absorb radiation even more efficiently than carbon dioxide, but are also present at much lower concentrations. Their high GWP results from their molecular structure which makes them absorb strongly and at different wavelengths from water vapor and carbon dioxide. For example, although ozone is usually thought of as an absorber of ultraviolet radiation, it also absorbs longwave infrared at 9.6 micrometers. These gases account for another 1.3% of the natural greenhouse gas effect. The increase in the global energy balance caused by greenhouse gases is called "radiative forcing".

The GWP of a greenhouse gas is the ratio of the time-integrated radiative forcing from 1 kg of the gas in question compared to 1 kg of carbon dioxide. These GWP values are calculated over a 100 year time horizon and take into consideration not only the absorption of radiation at different wavelengths, but also the different atmospheric lifetimes of each gas and secondary effects such as effects on water vapor. For some gases, the GWP is too complex to calculate because the gas participates in complex chemical reactions. Most researchers use the GWPs compiled by the World Meteorological Organization (WMO).

Even though most of the so-called greenhouse effect is caused by water vapor, about 1-2 degrees of our current empirically-measured temperature of roughly 288 K (59° F) can be attributed to carbon dioxide. Water vapor is at least 99.99% of 'natural' origin, which is to say that no amount of deindustrialization could ever significantly change the amount of water vapor in the atmosphere. Thus, climatologists have concentrated mostly on carbon dioxide and methane.

Figures from the U.S. Department of Energy show that the pre-industrial baseline of carbon dioxide is 288,000 ppb. The total current carbon dioxide is 368,400 parts per billion, or 0.0368% of the atmosphere. The ocean and biosphere possess a large buffering capacity, mainly because of carbon dioxide's large solubility in water. Because of this, it is safe to conclude that the anthropogenic component of atmospheric carbon dioxide concentration will remain roughly proportional to the rate of carbon dioxide emissions: the buffer is in no danger of being saturated, which would otherwise allow all the emitted carbon dioxide to accumulate in the atmosphere.

Of course, climate, like weather, is complex, nonlinear, and perhaps even chaotic. Increased solar irradiation can lower the albedo, which would amplify any effect caused by changes in solar flux, making the relation between radiation and temperature greater than linear.

Increased temperatures also cause increased evaporation of sea water, which can cause warming because of water's greenhouse effect, and also can affect the radiation flux by creating additional clouds.

The arithmetic of absorption of infrared radiation also works to decrease the linearity. Absorption of light follows a logarithmic curve as the amount of absorbing substance increases.

Figure 1: Transmitted light is a logarithmic function of concentration. This curve is the familiar Beer's Law.

It is generally accepted that the concentration of carbon dioxide in the atmosphere is already high enough to absorb almost all the infrared radiation in the main carbon dioxide absorption bands over a distance of only a few km. Thus, even if the atmosphere were heavily laden with carbon dioxide, it would still only cause an incremental increase in the amount of infrared absorption over current levels.

The net effect of all these processes is that doubling carbon dioxide would not double the amount of global warming. In fact, the effect of carbon dioxide is roughly logarithmic. Each time carbon dioxide (or some other greenhouse gas) is doubled, the increase in temperature is less than the previous increase. The reason for this is that, eventually, all the longwave radiation that can be absorbed has already been absorbed. It would be analogous to closing more and more shades over the windows of your house on a sunny day -- it soon reaches the point where doubling the number of shades can't make it any darker.
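The window-shade analogy corresponds to Beer's-law saturation: transmitted light falls off exponentially with the amount of absorber, so each doubling of the absorber adds less absorption than the one before. A small Python illustration (the absorption coefficient here is an arbitrary illustrative value, not a CO2 measurement):

```python
import math

def absorbed_fraction(concentration, k=1.0):
    """Beer's law: fraction of incident light absorbed at a given
    absorber concentration (path length folded into the coefficient k)."""
    return 1.0 - math.exp(-k * concentration)

# Each doubling of the absorber adds less absorption than the one before:
for c in (1, 2, 4, 8):
    print(c, round(absorbed_fraction(c), 4))
# 1 0.6321
# 2 0.8647
# 4 0.9817
# 8 0.9997
```

The increments shrink from about 23 points to 12 to under 2, which is the "closing more shades" effect the text describes.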

What does saturation mean?

The "saturation" argument does not mean that global warming doesn't occur. What saturation tells us is that exponentially higher levels of CO2 would be needed to produce a linear increase in absorption, and hence temperature. This is basic physics. Beer's law has not been repealed. CO2 is very nearly homogeneous throughout the atmosphere, so its concentration (as a percentage of the total) is about the same at all altitudes.

The presence or absence of water vapor has no bearing on whether radiation is absorbed by CO2. Water vapor is a red herring: it has essentially no effect on what CO2 does. Where water vapor becomes important is in the earth's response to CO2.

Saturation does not tell us whether CO2 can raise the atmospheric temperature, but it gives us a powerful clue about the shape of the curve of temperature vs. concentration.

Linear Climate projections:

The linear projection shown here, while obviously simplistic, is a more straightforward argument than those used in climate models, because it does not treat the radiative forcing caused by carbon dioxide separately from the planet's adaptation to it. In other words, we did not just build a model and add carbon dioxide, but instead took numbers that are based on empirical measurements and extended them by a small percentage.

Figure 2: Estimated greenhouse gas-induced global warming plotted against greenhouse gas concentrations expressed as a percentage of current-day values. The black curve is a linear extrapolation calculated from the DOE estimates of total current greenhouse gases. The sharp jump at the right is the data point from one computer model that predicts a nine degree increase from doubling current levels of carbon dioxide.

From the numbers, it is easy to calculate, assuming a linear dependence of temperature on greenhouse gas concentrations, that a doubling of atmospheric carbon dioxide would produce an additional warming of (0.042 to 0.084) x 33 = 1.38 to 2.77 degrees centigrade.
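That arithmetic can be checked in a few lines of Python, using the article's own figures (CO2 as 4.2-8.4% of a 33-degree total greenhouse effect):

```python
# Check the upper-bound arithmetic: CO2's share of the total
# greenhouse effect (4.2% to 8.4%, per the text) times 33 degrees C.
total_warming_c = 33.0
co2_share_lo, co2_share_hi = 0.042, 0.084

warming_lo = co2_share_lo * total_warming_c   # 1.386
warming_hi = co2_share_hi * total_warming_c   # 2.772
print(f"{warming_lo:.2f} to {warming_hi:.2f} degrees C")  # 1.39 to 2.77 degrees C
```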

Our calculation also assumes that the increase in temperature is linearly proportional to the greenhouse gas levels. However, as indicated above, the relationship is not linear, but logarithmic. A plot of temperature vs. gas concentration (expressed as a percentage of current-day levels) would be a convex curve, something like the blue curve in Figure 2. Thus, 1.4-2.7 degrees is an upper bound, and depending on the exact shape of the blue curve, could be an overestimate of the warming effect.

This 1.4-2.7 degree estimate is comparable to the estimate of 1.4 degrees associated with the "empiricist" school of the University of Delaware, University of Virginia, and Arizona State University. An increase of 1.4 degrees was also predicted by P.J. Michaels and R.C. Balling using the NCAR Community Climate Model 3 model, after the large increases in projected carbon dioxide in the original paper in which the model was described were replaced with more realistic ones.

Because a linear increase in temperature requires an exponential increase in carbon dioxide (thanks to the physics of radiation absorption described above), we know that the next two-fold increase in CO2 will produce exactly the same temperature increase as the previous two-fold increase. Although we haven't had a two-fold increase yet, it is easy to calculate from the observed values what to expect.

Between 1900 and 2000, atmospheric CO2 increased from 295 to 365 ppm, while temperatures increased about 0.57 degrees C (using the value cited by Al Gore and others). It is simple to calculate the proportionality constant (call it 'k') between the observed increase in CO2 and the observed temperature increase:

This shows that doubling CO2 over its current values should increase the earth's temperature by about 1.85 degrees C. Doubling it again would raise the temperature another 1.85 degrees C. Since these numbers are based on actual measurements, not models, they include the effects of amplification. These estimates assume that the correlation between global temperature and carbon dioxide is causal in nature. Therefore, the 1.85 degree estimate should also be regarded as an upper limit.
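The calculation of 'k', whose equation is omitted in this reprint, can be reconstructed from the numbers given, assuming the per-doubling (logarithmic) dependence the text describes; a Python sketch:

```python
import math

# Reconstruct the proportionality constant k from the observed
# 20th-century values quoted in the text, assuming a logarithmic
# (per-doubling) dependence: delta_T = k * log2(C_end / C_start).
c_1900, c_2000 = 295.0, 365.0   # atmospheric CO2 in ppm
delta_t = 0.57                   # observed temperature increase, degrees C

k = delta_t / math.log2(c_2000 / c_1900)
print(f"k = {k:.2f} degrees C per doubling")  # ~1.86, rounded to 1.85 in the text
```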

Figure 3: Calculated Temperature rise as a function of CO2 concentration.

It goes without saying that the results shown here depend on the accuracy of the original 33 degree estimate and the validity of extrapolating the existing curve out by an additional small increment. However, we can check the plausibility of the IPCC's result by asking the following question: Instead of 33 degrees, what number would result if we calculated backwards from the IPCC estimates?

Using the same assumption of linearity, if a 9 degree increase resulted from the above-mentioned increase of greenhouse gas levels, the current greenhouse gas level (which is by definition 100%) would be equivalent to a greenhouse gas-induced temperature increase of at least 107 degrees C. This means that for the 9 degree figure to be correct, the current global temperature would have to be at least 255 + 107 - 273 = 89 degrees centigrade, or 192° Fahrenheit! A model that predicts a current-day temperature well above the highest-ever observed temperature is clearly in need of serious tweaking.
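This plausibility check is easy to reproduce; the 9-degree model figure and the 8.4% share for doubled CO2 are carried over from the text, and 255 K is the stated no-greenhouse baseline:

```python
# Reproduce the back-calculation: if doubling CO2 (~8.4% of the total
# greenhouse effect) produced 9 degrees of warming, scale up to 100%
# and see what present-day temperature that would imply.
model_warming_c = 9.0       # one model's prediction for doubled CO2
co2_share = 0.084           # doubled CO2 as a share of total greenhouse effect
baseline_no_ghg_k = 255.0   # Earth's temperature with no greenhouse gases

implied_ghg_effect_c = model_warming_c / co2_share              # ~107 C
implied_temp_c = baseline_no_ghg_k + implied_ghg_effect_c - 273.0
implied_temp_f = implied_temp_c * 9 / 5 + 32
print(f"implied greenhouse effect: {implied_ghg_effect_c:.0f} C")
print(f"implied current temperature: {implied_temp_c:.0f} C = {implied_temp_f:.0f} F")
```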

What about secondary effects, such as ice melting, changes in albedo, and so forth? Doesn't this increase the predicted temperature beyond the 1.39 to 1.76 degree estimate? In short, no. Because these calculations are based on observed measurements, they automatically take into account all of the earth's responses. Whatever way the climate adapted to past CO2 increases, whether through melting, changes in albedo, or other effects, is already reflected in the measured temperature.

Some climatologists, making assumptions about ever-increasing rates of carbon dioxide production, assert that the doubling will occur within a few decades instead of a few centuries. However, they are doing sociology, not climatology. The only honest way to estimate the change in CO2 levels is a simple linear extrapolation; otherwise we are merely indulging in speculation about future social trends. At the current rate of increase, CO2 will not double its current level until 2255.
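A linear extrapolation along these lines is easy to sketch. The concentration and growth rate below are illustrative assumptions (roughly 380 ppm in the mid-2000s, rising about 1.5 ppm per year), not figures from the article, so the result only approximates the article's 2255:

```python
# Illustrative assumptions (not taken from the article):
current_year = 2005
current_co2 = 380.0   # ppm, approximate mid-2000s level
rate = 1.5            # ppm per year, linear extrapolation

# Years until CO2 reaches double its current level
years_to_double = current_co2 / rate
doubling_year = current_year + years_to_double

print(round(doubling_year))  # roughly 2258, close to the article's 2255
```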

Conclusions:

Although carbon dioxide is capable of raising the Earth's overall temperature, the IPCC's predictions of catastrophic temperature increases produced by carbon dioxide have been challenged by many scientists. In particular, the importance of water vapor is frequently overlooked by environmental activists and by the media.

The above discussion shows that the large temperature increases predicted by many computer models are unphysical and inconsistent with results obtained by basic measurements. Skepticism is warranted when considering computer-generated projections of global warming that cannot even reproduce existing observations.


{T. J. Nelson is a practicing research scientist at a non-profit institute affiliated with a large University. - Ed.}


Fibonacci's Golden Mean…Evidence of Design or Evolution?

by Allen Williams


We have often heard from evolutionists that those holding the view that life is a product of Intelligent Design are ‘Creobots’ or ‘ID’iots, as if nothing more than emotion or a less evolved intellect could lead one to consider the possibility of design rather than natural selection.

Evolutionist Richard Dawkins claims that the ‘God delusion’ is behind the notion of design and purpose in mankind and its surroundings. Is this true? The field of mathematics is a highly structured and orderly application of human reason and logic, one that did not evolve but was developed through incisive thought and mathematical tests.

The principles of mathematics developed through observation and the desire to quantify what was observed. Its definitions, identities, and axioms define an absolute truth, unerring and undeviating, and form the basis that establishes the theorems of application. The foundation of mathematics rests on logic; it is a "deductive study of numbers, geometry and various abstract constructs" (Columbia Encyclopedia, 1991). So how is the mental reasoning that allows truth to be discovered in the world of numbers suddenly flawed when it comes to discovering the truth of an Intelligent Designer?

One evolutionist claims that "Real life is not binary, it uses real numbers (I’m not saying that binary numbers aren’t real, but that real life can use rational and irrational numbers). At the core, proofs in biology are not different from proofs in mathematics. "

Really? Well, if real life can use irrational numbers to represent physical phenomena, then it can equally utilize binary numbers, i.e. ‘0, 1’, to describe the state of certain life conditions. After all, what is life or death but a binary function? You’re either alive or dead; there isn’t any in-between. Now, according to this evolutionist’s statement, if numbers in life forms can be shown to be real and part of a family, then that would constitute evidence for the existence of an Intelligent Designer, just as it would for the existence and validity of a mathematical theorem.

In Fermat’s Last Theorem, equations of the form x^n + y^n = z^n have been shown to be irreducible and modular, and therefore part of a family of elliptic curves. So, is there any biological evidence of some common mathematical relationship in living plants and animals? If not, then this would tend to support evolution’s notion of randomness in creating life; but if so, then design is inferred.

An Italian mathematician named Leonardo Fibonacci, or more properly Leonardo da Pisa, brought the modern decimal numbering system to Europe around 1200 AD. His travels as a merchant throughout Algeria, Syria, Greece, etc., brought him into contact with the Arabic, or base-10, number system. He is best known for a simple mathematical series introduced in his publication Liber Abaci; its numbers are named Fibonacci numbers in his honor. The series begins with zero, and each successive number is the sum of the last two, i.e. 0 + 1 = 1 and 1 + 1 = 2, etc., so that 0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55 and so on defines the series.

A unique attribute of the series is that dividing a number by the preceding smaller one yields a ratio called the ‘golden mean’ or ‘divine number’. It is the ratio of the current number in the Fibonacci series, x(n+1), divided by the previous number, x(n): e.g. 55/34 = 1.6176. This number series can be visualized as a group of boxes or rectangles as shown in Figure 1.
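The series and its converging ratio are easy to reproduce (a minimal sketch):

```python
# Generate the Fibonacci series: each term is the sum of the previous two
fib = [0, 1]
while fib[-1] < 100:
    fib.append(fib[-1] + fib[-2])
# fib now holds 0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144

# Ratio of successive terms approaches the golden mean
ratio = fib[10] / fib[9]   # 55 / 34
print(round(ratio, 4))     # 1.6176, as in the text
```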

Figure 1 – Fibonacci Blocks

Now, consider the line segment, 'ac'


               a                                                b                                                         c


Postulate that there exists an optimum ratio, i.e. of height to width, represented by the line segment above. If we pick a point along the line, called 'b', to represent this optimum, then: ab/bc = bc/ac

An optimum ratio is an ideal length relationship between line segments. Such a relationship might be expected to be more than simply an interesting intellectual exercise. The line segment ratio in our example is the length of bc to ab. Now if we set ab = 1 and let x represent the segment bc, we have:

1/x = x/(1+x)

Cross-multiplying gives (1 + x) = x^2. This can be put into standard quadratic form as:

x^2 - x - 1 = 0, with positive root (1 + Sqrt(5))/2 = 1.618, the golden ratio. The equation may also be expressed in terms of its coefficients:

x^2 - (b/a)x - 1 = 0, where a = 1 and b = x.

The term (b/a) approaches a finite limit as the series is carried toward infinity:

LIM (b/a) -> Phi = 1.618. This relationship is also found to exist in angle groups as well as lines.
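The positive root of the quadratic, and its agreement with the Fibonacci ratio limit, can be verified directly (a quick check, not part of the original article):

```python
import math

# Positive root of x^2 - x - 1 = 0 via the quadratic formula
phi = (1 + math.sqrt(5)) / 2
print(round(phi, 3))  # 1.618

# The root satisfies the original proportion 1/x = x/(1 + x)
assert abs(1 / phi - phi / (1 + phi)) < 1e-12

# The ratio of successive Fibonacci terms tends to the same limit
a, b = 1, 1
for _ in range(30):
    a, b = b, a + b
print(round(b / a, 6))  # 1.618034
```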

This remarkable property is by no means restricted to the series developed by Leonardo Fibonacci; it occurs in limitless mathematical sequences. For example, take the polar coordinate axes 0, Pi/6, Pi/4, Pi/3, Pi/2, 2Pi/3, 3Pi/4, 5Pi/6, Pi, 7Pi/6, 5Pi/4, and so on. A Fibonacci-style series can be created from the polar axes themselves by adding together the last two numbers to get the next one in the sequence: 0.79, 1.84, 2.63, 4.47, 7.10, 11.57, 18.67, 30.24, 48.91, 79.15, and so on. A plot of the sequence is shown in Figure 2.
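The claim that an additive sequence started from any two seeds reaches the same limit is easy to check. Seeding with 0.79 and 1.84 reproduces the later terms listed above, and the rule drives the ratio of successive terms to 1.618 regardless of the seeds:

```python
# Additive (Fibonacci-rule) sequence from two arbitrary seeds
seq = [0.79, 1.84]
for _ in range(8):
    seq.append(seq[-1] + seq[-2])
# seq now holds 0.79, 1.84, 2.63, 4.47, 7.10, 11.57, 18.67, 30.24, 48.91, 79.15

ratio = seq[-1] / seq[-2]
print(round(ratio, 3))  # 1.618: the same limit, whatever the seeds
```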

This length-to-width relationship produces what is referred to as the ‘golden rectangle.’ It occurs time and again throughout nature: in the human body, as well as the animal and plant world. Note its similarity to both the Fibonacci spiral and the seashell of Figures 6 & 7, respectively.

"Phidias, the Greek sculptor, and many others in Greece and Egypt used this relationship in creating their works of art and architecture. Furthermore, the ratio and the golden spiral are ubiquitous throughout creation," so says Frederick A. Willson in "The Ubiquity of the Divine (Golden) Ratio and Fibonacci Numbers Throughout the Heavens and Earth." See also: http://plus.maths.org/issue3/fibonacci

Figure 3 shows the plot of progressive (b/a) calculations for the polar axes, as determined from the polar series of Figure 2. Note that this series limit is the predicted 1.618, the same ratio demonstrated by our line segment 'ac' above.

"The Golden Mean or Phi occurs frequently in nature and it may be that humans are genetically programmed to recognize the ratio as being pleasing. Studies of top fashion models revealed that their faces have an abundance of the 1.618 ratio." – Frederick A. Willson in "Ubiquity of the Divine." See: Fibonacci Numbers in nature

Human DNA

Human DNA exhibits this same divine number or golden mean. The double helix structure of DNA, shown in Figure 4, measures 34 angstroms long by 21 angstroms wide for each full cycle; 34/21 = 1.619. The length between the major and minor grooves (the blue and yellow lines) exhibits this same 1.618 ideal ratio.

Human DNA: http://goldennumber.net/dna.htm

Figure 4. – Human DNA
(Photo: Double Helix)

Francis Crick, a co-discoverer of the double helix structure of DNA, has calculated that the probability of such a complicated molecular string forming on its own is essentially zero. The biological code written into the human cell would fill 1,600 pages.

There is as much information in the cell as is contained in 600 encyclopedias. Cell mechanics are also extremely complex: "Life at the molecular level is composed of tiny machines… There are more machines inside the human body than all the combined factories of the world… There are 2 trillion chemical processes that take place in the human cell." - Michael Behe, ‘Darwin's Black Box’.

Genetics has shown that mankind is all ONE blood (Acts 17:26) requiring no more than a 0.012% variation in DNA to account for all the outward physical characteristics, including skin color.

It seems a stretch to the logical mind that evolution not only produces life by random acts of chance but has also produced it in an optimal aspect ratio of length to width, i.e. 1.618:1. On the one hand, natural selection's chance combinations purportedly create life through confusion; on the other, we have evidence of Intelligent Design as noted in I Corinthians 14:33, which states that "For God is not the author of confusion…"

Intelligent Design’s ‘Anthropic Principle’ suggests that there are a series of optimum constants necessary to support life, one of which is the spiral galaxy. There are three basic galactic forms, elliptical, irregular and spiral. However, only spiral galaxies have a near round orbit capable of supporting life.

Figure 5 – Spiral Galaxy
(Photo: Ursa Major)

The golden mean in spiral galaxies: http://www.wisdomportal.com/Geometry/SpiralPage.html

Many plants exhibit the Fibonacci series in petals, leaves, sections and seeds: http://goldennumber.net/plants.htm Fibonacci numbers are also found in living organisms including the cochlea of the human ear: http://www.world-mysteries.com/sci_17.htm

Figure 6 - Fibonacci Spiral
(Photo: Nautilus Construct)

Using the blocks of Figure 1, we may construct an idealized spiral by drawing a series of lines through the diagonals of each of the Fibonacci squares and then connecting them with a series of smooth curves or arcs, as shown in Figure 6.
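The idealized spiral can be approximated numerically by a logarithmic ('golden') spiral whose radius grows by a factor of Phi every quarter turn — a sketch of the geometry, not the exact block construction described above:

```python
import math

PHI = (1 + math.sqrt(5)) / 2  # the golden ratio

def golden_spiral_point(theta):
    """Point on a logarithmic spiral whose radius grows by PHI per quarter turn."""
    r = PHI ** (2 * theta / math.pi)
    return (r * math.cos(theta), r * math.sin(theta))

# One extra quarter turn multiplies the distance from the origin by PHI,
# mirroring the step from one Fibonacci square to the next
r1 = math.hypot(*golden_spiral_point(math.pi))
r2 = math.hypot(*golden_spiral_point(math.pi + math.pi / 2))
print(round(r2 / r1, 6))  # 1.618034
```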

The resulting figure looks remarkably like the Nautilus seashell of Figure 7. 

Figure 7: Nautilus Shell (Photo: http://www.sacredarch.com/sacred_geo_exer_shell.htm )


Matter, chance, and time are the trinity of evolution. And so unique mathematical relationships conflict with material science’s assertion that there are no absolute truths. The fact that a unique aspect ratio is so pervasive throughout nature is strong evidence of intentional design rather than the chance combinations in matter that evolutionists have asserted.