Archive for July, 2008

Announcing the death of Climate Science as we know it

It’s fair to say that the science underpinning the proposition that mankind is responsible for the warming the planet has experienced over the last hundred or so years has copped a battering.

Less battered in the public arena are political operatives, the Climate Liars, such as the IPCC, Al Gore, James Hansen, the Hockey Stick team and Big Green environmental groups who currently hold public opinion in their grasp.

Europe is already suffering the economic pain inflicted on its people by the travesty of pandering to misanthropic green groups with the unilateral introduction of carbon trading laws.

While the extent and cause can be argued, it’s also clear that many developing nations are suffering the unintended consequence of higher food prices brought about by the use of ethanol as a fuel substitute.

Climate change policies have as their platform the output of a variety of climate models. As I’ve pointed out previously, these models have a zero percent prediction rate, which makes one wonder how history will view the politicians who used them to further their political goals when the consequences are so dire.

So far, climate science resembles astrology more closely than real science and climate models’ predictions are closer to horoscopes than anything meaningful.

If climate science is indeed real science then it should make predictions that can be tested. As Karl Popper tells us, the essence of science is falsifiability: a theory must make predictions that could, in principle, be shown false. If a prediction fails when tested, the theory is wrong.

So, taking a purely scientific approach, is there anything that kills the current predictions of climate science?

As it happens there is.

Before I go to that I’ll provide a similar example of a theory making a prediction that could be tested. A negative result would have killed the theory stone dead. A positive result didn’t make the theory fact but maintained the theory’s ongoing relevance.

That was the Big Bang Theory.

If you’ve read George Smoot’s book Wrinkles In Time then you’ll be familiar with the story.

In order for the Big Bang theory to be correct there must exist a cosmic microwave background (CMB) radiation signature that could be detected. Smoot, along with John Mather, led a team of scientists in the search for the signal. In 1989 NASA launched its Cosmic Background Explorer (COBE) satellite.

After more than two years of observation and analysis, the COBE research team announced on 23 April 1992 that the satellite had detected tiny fluctuations in the CMB, a breakthrough in the study of the early universe.

Thus, a key prediction of the theory was shown to be true. For their work Smoot and Mather won the Nobel Prize in Physics in 2006.

In order for climate models to be correct – and thus the climate science upon which they are based to be correct – they must make predictions that can be tested.

The most important of these is the signature of an increased greenhouse effect: a hotspot in the atmosphere over the tropics.

In Dr David Evans’ recent paper The Missing Greenhouse Signature he states:

Each possible cause of global warming has a different pattern of where in the planet the warming occurs first and the most. The signature of an increased greenhouse effect is a hotspot about 10 km up in the atmosphere over the tropics.

We have been measuring the atmosphere for decades using radiosondes—weather balloons with thermometers that radio back the temperature as the balloon ascends through the atmosphere. They show no hotspot whatsoever.

So an increased greenhouse effect is not the cause of the recent global warming. So we now know for sure that carbon emissions are not a significant cause of the global warming.

To support this claim he provides the following evidence:

The theoretical signatures come from the latest big report from the IPCC, which is the most authoritative document for those who believe carbon emissions caused global warming. The IPCC Assessment Report 4 (AR4), 2007, Chapter 9, Figure 9.1, page 675, shows six greenhouse signature diagrams.

In each diagram the horizontal axis is the latitude, from the north pole (90 degrees north) through the equator to the south pole (90 degrees south). The vertical axis shows the height in the atmosphere, marked on the left hand side as 0–30 km (and on the right hand side as the corresponding air pressures in hPa). The coloured regions on each diagram show where the temperature changes occur for each possible cause (red +1°C, yellow +0.5°C, green −0.5°C, blue −1°C per century).

…The other main authoritative source for the case that carbon emissions caused global warming is the US Climate Change Science Program (CCSP). Atmospheric temperatures have been measured by radiosondes (at all heights) since the 1960s, and by satellites using microwave sensors (up to 5 km) since 1979. The CCSP published the results for 1979 – 1999 in part E of Figure 5.7 in section 5.5 on page 116:

The axes and colours are as per the signature diagrams above, except that the horizontal axis only goes from 75 degrees north to 75 degrees south, there is no data around 60 degrees south, the vertical axis only goes up to 24 km, and dark blue above becomes purple here. The data is called the “HadAT2 temperature data”.

This diagram is confirmed by more radiosonde data collected after 1999, and also after May 2006 when this diagram was published.

Evans concludes:

The theoretical combined signature expected by the IPCC contains a prominent and distinct hotspot over the tropics at 8–12 km. This hotspot is the signature feature of an increase in greenhouse warming.

The observed signature at 8 – 12 km up over the tropics does not contain a hotspot, not even a little one. Therefore:

  1. The IPCC theoretical signature is wrong. So the IPCC models are significantly wrong.
  2. The signature of increased greenhouse warming is missing. So the global warming from 1979 to 1999 was not due predominantly to increased greenhouse warming, and was therefore not due to carbon emissions.

The observed signature shows cooling above 16 km, which strongly suggests that the global warming was not due to increased solar irradiation, volcanoes, or increased industrial pollution (aerosols). The observed signature looks like a combination of increased ozone depletion, possibly a decrease in industrial pollution, and an unknown signature or signatures.

When the signature was found to be missing, alarmists objected that the readings of the radiosonde thermometers might not be accurate and that the hotspot might be there but have gone undetected. The uncertainties in temperature measurements from a single radiosonde are indeed large enough for it to miss the hotspot. Yet hundreds of radiosondes have given the same answers, so statistically it is not possible that they collectively failed to notice the hotspot.

Recently the alarmists have suggested we ignore the radiosonde thermometers, but instead take the radiosonde wind measurements, apply a theory about wind shear, and run the results through their computers to estimate the temperatures. They then say that the results show that we cannot rule out the presence of a hotspot. If you believe that you believe anything.
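The statistical point in the quoted passage — that hundreds of noisy instruments cannot collectively miss a real signal — is just the standard error of the mean shrinking with sample size. A minimal sketch (the figures below are hypothetical, not taken from Evans’ paper):

```python
import math

def standard_error_of_mean(sigma_single, n):
    """Uncertainty of the average of n independent measurements,
    each with standard deviation sigma_single."""
    return sigma_single / math.sqrt(n)

# Hypothetical figures: suppose a single radiosonde reading is uncertain
# to +/- 1.0 C -- easily enough to miss a few-tenths-of-a-degree hotspot.
print(standard_error_of_mean(1.0, 1))    # one balloon: 1.0 C of noise
print(standard_error_of_mean(1.0, 400))  # 400 balloons: 0.05 C of noise
```

Averaged over hundreds of independent soundings, the residual noise becomes far smaller than the hotspot the models predict, which is the basis of the “statistically it is not possible” claim above.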

Climate science makes a key prediction that can be tested.

When tested the prediction is shown to be false.

Therefore, climate science is shown to be false.

Climate Science is dead. Long may it rot in hell.

Long live real science.

(Nothing Follows)

Categories: Climate Change

If it wasn’t so serious then it’d be funny

If the issue of climate change were not as serious as it is then the ongoing destruction of the Climate House Of Cards would be hilarious.

I have said for ages that climate models have never been accurate. Demetris Koutsoyiannis has previously shown this to be the case and goes into even more detail about how bad the models are in his new paper, which is well worth a read.

The summary of his paper is:

Geographically distributed predictions of future climate, obtained through climate models, are widely used in hydrology and many other disciplines, typically without assessing their reliability. Here we compare the output of various models to temperature and precipitation observations from eight stations with long (over 100 years) records from around the globe. The results show that models perform poorly, even at a climatic (30-year) scale. Thus local model projections cannot be credible, whereas a common argument that models can perform better at larger spatial scales is unsupported.

Given the paucity of data supporting the proposition that CO2 is the main driver of climate change and the models’ complete inability to forecast accurately one would think that the issue would fade from public view.

Au contraire, mon ami.

There is one little problem with that conclusion.

Politics trumps science.

In the 1930s Stalin allowed the implementation of Trofim Lysenko’s disastrous agricultural policies. In James Hansen the 21st century has its own Lysenko.

It should come as no surprise that politicians use science for their own ends and, in the event that the science is falsified or doesn’t support the (typically) imposition of new taxes on the public at large, they simply do the three monkeys thing and ignore the inconvenience.

That’s why we’re in so much danger here in Australia. Both the Labor government and pissweak Coalition opposition favour the introduction of an emissions trading scheme.

For gawdsake – it snowed in Sydney the other day; the coldest Sydney day for over 40 years. Can’t these politicians stick their heads out the window for a brief moment to discover the truth? Are their minds so closed to scientific reality?

Apparently so.

The Australian population is yet to understand that the emissions trading scheme is simply another tax on them. Governments around the world are using business as a proxy for the collection of more tax dollars. Any cost increase to business inputs simply gets passed straight on to the consumer.

There is no difference to the average taxpayer between introducing an emissions trading scheme and increasing the GST from 10% to around 15% in the short term, and to 30%-40% later, in order to hit the targets outlined in the Garnaut Report.

By introducing the emissions trading scheme the government is telling Australian Working Families that it prefers to support China’s and India’s economic growth and standard of living over our own.

Western apologists are putting the argument that India and China need the chance to ‘catch up’ in terms of economic development and that, after all, it was the West that created the problem. This is certainly the position being taken by those countries and other developing nations such as Brazil.

The problem with that logic is that these countries had the same opportunity to grow their economies since 1950 but chose instead to implement the Marxist economic policies that impoverished all but the fortunate few at the top.

India and China invested no money at all into the development of the technologies that they are now able to take advantage of with their new found wealth. The West paid the carbon price for that.

Why not charge them a ‘catch up’ amount that would be in the same order as, say, a carbon emissions scheme?

Perhaps Australia’s Mandarin speaking Prime Mandarin doesn’t want to get offside with those people for whom he obviously has more regard than he does for Australian Working Families.

As I said, if it wasn’t so serious then it’d be funny.

(Nothing Follows)

Categories: Climate Change

More on Peak Oil hooha

In the ‘a picture tells the story’ category come the following graphs, courtesy of the always informative Peak Oil Debunked.

Where’s the problem?

(Nothing Follows)

Categories: Energy

Climate data fudged, consensus breaking down. Politicians still ignore reality.

The great scandal of climate science is not only the manipulation of data being undertaken by people like James Hansen but also the failure of the mainstream climate community to make public their research data in spite of the fact that it’s been created using billions of dollars of the public’s money.

The Climate Fortress attitude is best summed up by Phil Jones’ response to an enquiry from Steve McIntyre:

We have 25 or so years invested in the work. Why should I make the data available to you, when your aim is to try and find something wrong with it. There is IPR to consider.

Intellectual property rights have something to do with the accuracy of science? Jones has had his snout in the public trough for 25 years and his data is not available?

Jones is afraid of someone finding something wrong with his data?

What sort of scientist is afraid of the truth?

A climate scientist, of course.

These people will be remembered along with Lysenko and Hwang as modern day scientific shysters.

The facade that is climate science is breaking down under an increasing barrage of analysis being done by scientists who are no longer afraid to speak out, as the UK Telegraph’s Christopher Booker highlights.

Considering that the measures recommended by the world’s politicians to combat global warming will cost tens of trillions of dollars and involve very drastic changes to our way of life, it might be thought wise to check the reliability of the evidence on which they base their belief that our planet is actually getting hotter.

There are four internationally recognised sources of data on world temperatures, but the one most often cited by supporters of global warming is that run by James Hansen of Nasa’s Goddard Institute for Space Studies (GISS).

Hansen has been for 20 years the world’s leading scientific advocate of global warming (and Al Gore’s closest ally). But in the past year a number of expert US scientists have been conducting a public investigation, through scientific blogs, which raises large question marks over the methods used to arrive at his figures.

First they noted the increasingly glaring discrepancy between the figures given by GISS, which show temperatures continuing to race upwards, and those given by the other three main data sources, which all show temperatures having fallen since 1998, dropping dramatically in the past year to levels around the average of the past 30 years.

Two sets of data, from satellites, go back to 1979: one produced by Dr Roy Spencer, formerly of Nasa, now at the University of Alabama, Huntsville, the other by Remote Sensing Systems. Their figures correspond closely with those produced by the Hadley Centre for Climate Studies of our own Met Office, based on global surface temperature readings.

Right out on their own, however, are the quite different figures produced by GISS which, strangely for a body sponsored by Nasa, rely not on satellites but on surface readings. Hansen’s latest graph shows temperatures rising since 1880, at accelerating speed in the past 10 years.

The other three all show a flattening out after 2001 and a marked downward plunge of 0.6 degrees Celsius in 2007/8, equivalent to almost all the net warming recorded in the 20th century. (For comparisons see “Is the Earth getting warmer, or colder?” by Steven Goddard on The Register website.)

Even more searching questions have been raised over Hansen’s figures by two expert blogs. One is Climate Audit, run by Steve McIntyre, the computer analyst who earlier exposed the notorious “hockeystick” graph that was shamelessly exploited by the Intergovernmental Panel on Climate Change and Al Gore. (This used a flawed computer model to suppress evidence that the world was hotter in the Middle Ages than today.) The other site is Watts Up With That, run by the meteorologist Anthony Watts.

It was McIntyre who last year forced Hansen to publish revised figures for US surface temperatures, to show that the hottest years of the 20th century were not in the 1990s, as Hansen had claimed, but in the 1930s. He has now shown that Hansen had been adjusting almost all his pre-1970 global temperature figures downwards, by as much as 0.5 degrees, and his post-1970 figures upwards.

Although Hansen claimed that this only resulted from more careful calculations, McIntyre pointed out how odd it was that the adjustments all seemed to confirm his thesis.

Watts meanwhile has also been conducting an exhaustive photographic survey of US surface weather stations, showing how temperature readings on more than half have been skewed upwards by siting thermometers where their readings are magnified by artificial heat-sources, such as asphalt car parks or air-conditioning systems.

All this has raised such doubts over the methodology behind the GISS data that informed observers are calling for it to be independently assessed. Hansen himself is notoriously impatient of any criticism of his methods: earlier this month he appealed to Congress that the leaders of those who question global warming should be put on trial.

It is still too early to suggest that the recent drop in temperatures shown by everyone but him is proof that global warming has stopped. But the fact is that not one of those vaunted computer models predicted what has happened to temperatures in recent years. Yet it is on those models (and Hansen’s alarmist figures) that our politicians are basing all their proposals for irrevocably changing our lives.

(Nothing Follows)

Categories: Climate Change

Sunday night rock ‘n’ roll

Jesus Jones is a London-based British rock group that recorded and performed in the late 1980s, throughout the 1990s, and into the 2000s.

Incorporating elements of electronic music styles such as house and techno into an indie rock format, along with fellow British groups such as The Shamen, Pop Will Eat Itself and EMF, Jesus Jones were one of the leading purveyors of the early 1990s “indie dance” scene. The band is led by Mike Edwards.

They achieved initial critical acclaim with their 1989 album Liquidizer, and in particular, the single “Info Freako”, which featured buzzing rock guitars with samples and a hip-hop sensibility, relatively new for the time. The track was particularly championed by Bruno Brookes on his Radio 1 evening show.

In the spring of 1990, Jesus Jones recorded their second album, Doubt, but their label was forced to delay its release until the beginning of 1991. The album sold very well, due to the success of their best-known hit “Right Here, Right Now”. The song, about the swift end of the Cold War, was a No. 2 hit in the U.S. but reached only No. 31 in the UK. It was resurrected in 2006 as an advertising jingle for the American retailer Kmart, in an image campaign for CBS News, and in promotional advertisements for the now defunct TV channel TechTV.

Other hit singles from the Doubt album included “Real, Real, Real” and “International Bright Young Thing”. In the year that Doubt was released, Jesus Jones won the “Best Newcomer” award at the MTV Awards.

The follow-up to Doubt was Perverse which, although a big seller, did not reach the worldwide hit status of Doubt.

International Bright Young Thing

Real, Real, Real

Right Here, Right Now

(Nothing Follows)

Categories: Music

The lie of blistering temperatures in Sydney, Melbourne and Adelaide

July 26, 2008

In an article published in The Australian it is claimed that extreme temperatures of more than 50C (122F) will become part of the climate landscape for the major Australian cities of Sydney, Melbourne and Adelaide.

To call this claim ‘scientific research’ is to misuse the term completely.

How does it come about that such extreme temperatures could be predicted?

Read the article first…

MELBOURNE, Adelaide and Sydney will blister in temperatures of more than 50C by 2050, according to the first hard look at the impact of climate change on extreme weather.

The forecast is part of a long-term prediction that temperatures on the hottest day of the year will rise dramatically in parts of southern Australia, including the southern Murray-Darling Basin, much of coastal NSW, Victoria and South Australia.

But the study did not find evidence that other parts of Australia would be so severely affected.

“No one’s ever looked at these numbers before,” said Andy Pitman, co-director of the University of NSW Climate Change Research Centre in Sydney.

Scientists with the CSIRO and the Australian Bureau of Meteorology have also assessed the nation’s future climate but they focused on average changes in extremes of temperature and rainfall due to climate change.

Along with graduate student Sarah Perkins, Professor Pitman analysed daily temperatures. “There is nothing wrong with what they did, but they missed that last bit of evidence that identified the ‘extreme’ extremes,” Professor Pitman said.

The researchers first tested the effectiveness of many climate modelling systems by “hind-casting”, testing how well they predicted past conditions.

After identifying the most reliable models, they simulated daily changes in temperature and rainfall as greenhouse gases increased in the atmosphere. They found the increase altered the pattern of warming for rare super-hot days.

To their surprise, there was also an indirect effect. Global warming led to a reduction in rainfall which, in turn, reduced evaporation. “If there’s less evaporation, the land surface becomes hotter, a process known as positive feedback,” Professor Pitman said.

That is why extreme events in places such as Darwin and Perth did not outpace those in the south: there’s no feedback there.

…the research comes from analysing models and how they performed when hind-casting. Taking the most accurate models, as stated in the article, the models are then run forward in time to 2050 and, lo and behold, 50C is the answer.
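The procedure described in the article – score candidate models against the past, keep the best hind-caster, then run it forward – can be sketched as follows. This is purely illustrative: the “models”, the historical record and the error score are all invented for the example.

```python
import math

def rmse(predicted, observed):
    """Root-mean-square error between two equal-length series."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / len(observed))

def best_hindcaster(models, history):
    """Score each candidate model against the historical record and
    return the one with the lowest hind-cast error."""
    return min(models, key=lambda m: rmse([m(t) for t in range(len(history))], history))

# Toy historical record (temperature anomalies) and two toy "models".
history = [0.0, 0.1, 0.25, 0.3, 0.45]
model_a = lambda t: 0.1 * t   # roughly tracks the record
model_b = lambda t: 0.5 * t   # warms far too fast

best = best_hindcaster([model_a, model_b], history)
forecast_2050 = best(42)  # the winner is then simply run forward in time
```

The objection made in this post is that skill at reproducing the past says little about skill out of sample: the selection step above never tests the winning model on data it has not already seen.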

Climate models are created by programming a number of algorithms related to well understood atmospheric physics. Past temperatures are then assessed against a model’s output and adjustments are made until the model represents the historical record accurately.

For example, while CO2 rose from 1940 to 1975 the world’s temperature fell. How did model makers overcome this inconvenience? By ascribing a cooling effect to particulate matter. What parameter was assigned to this cooling effect? Exactly enough to make their models correct. How convenient.

Adjustments like this are going on every day in every model.
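The aerosol adjustment described above amounts to a one-parameter curve fit. As an illustration only – a toy least-squares fit with invented numbers, not an actual climate model – one can solve for whatever cooling coefficient k makes a CO2-driven warming term match a flat observed record:

```python
def fit_cooling_coefficient(co2_warming, aerosol_index, observed):
    """Least-squares k minimising sum((warming[i] - k*aerosol[i] - observed[i])**2).
    Closed form: k = sum(a_i * r_i) / sum(a_i**2), where r = warming - observed."""
    residual = [w - o for w, o in zip(co2_warming, observed)]
    return (sum(a * r for a, r in zip(aerosol_index, residual))
            / sum(a * a for a in aerosol_index))

co2_warming   = [0.1, 0.2, 0.3, 0.4]  # invented: what CO2 alone would produce
aerosol_index = [1.0, 2.0, 3.0, 4.0]  # invented: assumed particulate loading
observed      = [0.0, 0.0, 0.0, 0.0]  # invented: a flat 1940-75-style record

k = fit_cooling_coefficient(co2_warming, aerosol_index, observed)
# With k chosen this way, the toy model "explains" the flat record by construction.
```

The fitted parameter is exactly as large as it needs to be to cancel the mismatch, which is the point being made: agreement with history is built in, not predicted.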

Now, if you take one of these models and run it so that it produces a forecast then its reliability must be extremely questionable, given that the fudges are statistically invalid.

Unsurprisingly, no model has ever accurately forecast the world’s climate.

For these ‘researchers’ to take the model that has the best fudges so that they can hind-cast and then rely on the output to claim that Sydney, Melbourne and Adelaide are in for 50C days can only be called one thing.

A lie.

Why a lie? Why not wrong?

Because these researchers know full well that models have no predictive ability and are created by a series of fudges.

Therefore, they knew what the result would be before they began their ‘research’.

The Climate Faithful have, in my view, moved from being simply wrong to being downright liars.

They ignore the mounting evidence that they’re wrong and are now screaming even more loudly that we’re all doomed.

The scientific battle is all but over. Unfortunately, the political battle still rages.

Let’s hope that in Australia we can avoid the destruction of our global competitiveness by the planned introduction of an emissions trading scheme.

(Nothing Follows)

Categories: Australia, Climate Change

Education and the upper 50% IQ

July 25, 2008

Charles Murray continues his look at why educational outcomes are not what society expects.

As I mentioned yesterday, how many politicians are going to take on the reality of IQ’s effect on education when they can take the easier option of tax and spend instead?

Many people believe that smaller class sizes equal better outcomes. This may be true in certain circumstances, but worldwide research shows that class size is a weak predictor of a student’s educational outcome. Teacher quality is the strongest predictor, as demonstrated by countries like Finland, which has large class sizes and requires its teachers to hold a masters degree.

Smaller class size is pushed heavily by education unions, as it results in more teachers. The real effect is to lower teaching standards, as there are more salaries to be paid out of a bucket that does not grow proportionally in size.

The other interesting thing to note is that IQ tests given to eight-year-olds – adjusted for age and so on – have a 75%-80% success rate at predicting high school outcomes, which supports the argument that Charles Murray makes:

The topic yesterday was education and children in the lower half of the intelligence distribution. Today I turn to the upper half, people with IQs of 100 or higher. Today’s simple truth is that far too many of them are going to four-year colleges.

Begin with those barely into the top half, those with average intelligence. To have an IQ of 100 means that a tough high-school course pushes you about as far as your academic talents will take you. If you are average in math ability, you may struggle with algebra and probably fail a calculus course. If you are average in verbal skills, you often misinterpret complex text and make errors in logic.

These are not devastating shortcomings. You are smart enough to engage in any of hundreds of occupations. You can acquire more knowledge if it is presented in a format commensurate with your intellectual skills. But a genuine college education in the arts and sciences begins where your skills leave off.

In engineering and most of the natural sciences, the demarcation between high-school material and college-level material is brutally obvious. If you cannot handle the math, you cannot pass the courses. In the humanities and social sciences, the demarcation is fuzzier. It is possible for someone with an IQ of 100 to sit in the lectures of Economics 1, read the textbook, and write answers in an examination book. But students who cannot follow complex arguments accurately are not really learning economics. They are taking away a mishmash of half-understood information and outright misunderstandings that probably leave them under the illusion that they know something they do not. (A depressing research literature documents one’s inability to recognize one’s own incompetence.) Traditionally and properly understood, a four-year college education teaches advanced analytic skills and information at a level that exceeds the intellectual capacity of most people.

There is no magic point at which a genuine college-level education becomes an option, but anything below an IQ of 110 is problematic. If you want to do well, you should have an IQ of 115 or higher. Put another way, it makes sense for only about 15% of the population, 25% if one stretches it, to get a college education. And yet more than 45% of recent high school graduates enroll in four-year colleges. Adjust that percentage to account for high-school dropouts, and more than 40% of all persons in their late teens are trying to go to a four-year college–enough people to absorb everyone down through an IQ of 104.

No data that I have been able to find tell us what proportion of those students really want four years of college-level courses, but it is safe to say that few people who are intellectually unqualified yearn for the experience, any more than someone who is athletically unqualified for a college varsity wants to have his shortcomings exposed at practice every day. They are in college to improve their chances of making a good living. What they really need is vocational training. But nobody will say so, because “vocational training” is second class. “College” is first class.

Large numbers of those who are intellectually qualified for college also do not yearn for four years of college-level courses. They go to college because their parents are paying for it and college is what children of their social class are supposed to do after they finish high school. They may have the ability to understand the material in Economics 1 but they do not want to. They, too, need to learn to make a living–and would do better in vocational training.

Combine those who are unqualified with those who are qualified but not interested, and some large proportion of students on today’s college campuses–probably a majority of them–are looking for something that the four-year college was not designed to provide. Once there, they create a demand for practical courses, taught at an intellectual level that can be handled by someone with a mildly above-average IQ and/or mild motivation. The nation’s colleges try to accommodate these new demands. But most of the practical specialties do not really require four years of training, and the best way to teach those specialties is not through a residential institution with the staff and infrastructure of a college. It amounts to a system that tries to turn out televisions on an assembly line that also makes pottery. It can be done, but it’s ridiculously inefficient.

Government policy contributes to the problem by making college scholarships and loans too easy to get, but its role is ancillary. The demand for college is market-driven, because a college degree does, in fact, open up access to jobs that are closed to people without one. The fault lies in the false premium that our culture has put on a college degree.

For a few occupations, a college degree still certifies a qualification. For example, employers appropriately treat a bachelor’s degree in engineering as a requirement for hiring engineers. But a bachelor’s degree in a field such as sociology, psychology, economics, history or literature certifies nothing. It is a screening device for employers. The college you got into says a lot about your ability, and that you stuck it out for four years says something about your perseverance. But the degree itself does not qualify the graduate for anything. There are better, faster and more efficient ways for young people to acquire credentials to provide to employers.

The good news is that market-driven systems eventually adapt to reality, and signs of change are visible. One glimpse of the future is offered by the nation’s two-year colleges. They are more honest than the four-year institutions about what their students want and provide courses that meet their needs more explicitly. Their time frame gives them a big advantage–two years is about right for learning many technical specialties, while four years is unnecessarily long.

Advances in technology are making the brick-and-mortar facility increasingly irrelevant. Research resources on the Internet will soon make the college library unnecessary. Lecture courses taught by first-rate professors are already available on CDs and DVDs for many subjects, and online methods to make courses interactive between professors and students are evolving. Advances in computer simulation are expanding the technical skills that can be taught without having to gather students together in a laboratory or shop. These and other developments are all still near the bottom of steep growth curves. The cost of effective training will fall for everyone who is willing to give up the trappings of a campus. As the cost of college continues to rise, the choice to give up those trappings will become easier.

A reality about the job market must eventually begin to affect the valuation of a college education: The spread of wealth at the top of American society has created an explosive increase in the demand for craftsmen. Finding a good lawyer or physician is easy. Finding a good carpenter, painter, electrician, plumber, glazier, mason–the list goes on and on–is difficult, and it is a seller’s market. Journeymen craftsmen routinely make incomes in the top half of the income distribution while master craftsmen can make six figures. They have work even in a soft economy. Their jobs cannot be outsourced to India. And the craftsman’s job provides wonderful intrinsic rewards that come from mastery of a challenging skill that produces tangible results. How many white-collar jobs provide nearly as much satisfaction?

Even if forgoing college becomes economically attractive, the social cachet of a college degree remains. That will erode only when large numbers of high-status, high-income people do not have a college degree and don’t care. The information technology industry is in the process of creating that class, with Bill Gates and Steve Jobs as exemplars. It will expand for the most natural of reasons: A college education need be no more important for many high-tech occupations than it is for NBA basketball players or cabinetmakers. Walk into Microsoft or Google with evidence that you are a brilliant hacker, and the job interviewer is not going to fret if you lack a college transcript. The ability to present an employer with evidence that you are good at something, without benefit of a college degree, will continue to increase, and so will the number of skills to which that evidence can be attached. Every time that happens, the false premium attached to the college degree will diminish.

Most students find college life to be lots of fun (apart from the boring classroom stuff), and that alone will keep the four-year institution overstocked for a long time. But, rightly understood, college is appropriate for a small minority of young adults–perhaps even a minority of the people who have IQs high enough that they could do college-level work if they wished. People who go to college are not better or worse people than anyone else; they are merely different in certain interests and abilities. That is the way college should be seen. There is reason to hope that eventually it will be.

(Nothing Follows)

Categories: Education, United States

>Education and the lower 50% IQ

July 24, 2008 1 comment

>While Charles Murray of the American Enterprise Institute correctly notes the societal problems caused by low IQ, it’s probably fair to say that political correctness, and the assumption that more education will create better outcomes, are too entrenched for society to overcome.

Education is becoming the preferred method for diagnosing and attacking a wide range of problems in American life. The No Child Left Behind Act is one prominent example. Another is the recent volley of articles that blame rising income inequality on the increasing economic premium for advanced education. Crime, drugs, extramarital births, unemployment–you name the problem, and I will show you a stack of claims that education is to blame, or at least implicated.

One word is missing from these discussions: intelligence. Hardly anyone will admit it, but education’s role in causing or solving any problem cannot be evaluated without considering the underlying intellectual ability of the people being educated. Today and over the next two days, I will put the case for three simple truths about the mediating role of intelligence that should bear on the way we think about education and the nation’s future.

Today’s simple truth: Half of all children are below average in intelligence. We do not live in Lake Wobegon.

Our ability to improve the academic accomplishment of students in the lower half of the distribution of intelligence is severely limited. It is a matter of ceilings. Suppose a girl in the 99th percentile of intelligence, corresponding to an IQ of 135, is getting a C in English. She is underachieving, and someone who sets out to raise her performance might be able to get a spectacular result. Now suppose the boy sitting behind her is getting a D, but his IQ is a bit below 100, at the 49th percentile.

We can hope to raise his grade. But teaching him more vocabulary words or drilling him on the parts of speech will not open up new vistas for him. It is not within his power to learn to follow an exposition written beyond a limited level of complexity, any more than it is within my power to follow a proof in the American Journal of Mathematics. In both cases, the problem is not that we have not been taught enough, but that we are not smart enough.

Now take the girl sitting across the aisle who is getting an F. She is at the 20th percentile of intelligence, which means she has an IQ of 88. If the grading is honest, it may not be possible to do more than give her an E for effort. Even if she is taught to read every bit as well as her intelligence permits, she still will be able to comprehend only simple written material. It is a good thing that she becomes functionally literate, and it will have an effect on the range of jobs she can hold. But still she will be confined to jobs that require minimal reading skills. She is just not smart enough to do more than that.
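Murray's percentile figures follow from the standard assumption that IQ scores are normally distributed with mean 100 and standard deviation 15 (the 15-point SD is an assumption of this sketch; some tests use 16). As a rough check, the score-to-percentile conversions he quotes can be reproduced with the normal CDF:

```python
# Sketch: convert IQ scores to population percentiles, assuming
# IQ ~ Normal(mean=100, sd=15) as in the article's examples.
import math

def iq_percentile(iq, mean=100.0, sd=15.0):
    """Fraction of the population scoring below `iq` (standard normal CDF)."""
    z = (iq - mean) / sd
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

print(round(iq_percentile(135) * 100))  # -> 99 (the C-student girl)
print(round(iq_percentile(88) * 100))   # -> 21 (article rounds to the 20th percentile)
print(round(iq_percentile(95) * 100))   # -> 37 (article cites roughly 36% below IQ 95)
```

The exact values land within a point of the article's rounded figures, which is as close as the approximation warrants.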

How about raising intelligence? It would be nice if we knew how, but we do not. It has been shown that some intensive interventions temporarily raise IQ scores by amounts ranging up to seven or eight points. Investigated psychometrically, these increases are a mix of test effects and increases in the underlying general factor of intellectual ability–“g.” In any case, the increases fade to insignificance within a few years after the intervention. Richard Herrnstein and I reviewed the technical literature on this topic in “The Bell Curve” (1994), and studies since then have told the same story.

There is no reason to believe that raising intelligence significantly and permanently is a current policy option, no matter how much money we are willing to spend. Nor can we look for much help from the Flynn Effect, the rise in IQ scores that has been observed internationally for several decades. Only a portion of that rise represents an increase in g, and recent studies indicate that the rise has stopped in advanced nations.

Some say that the public schools are so awful that there is huge room for improvement in academic performance just by improving education. There are two problems with that position. The first is that the numbers used to indict the public schools are missing a crucial component. For example, in the 2005 round of the National Assessment of Educational Progress (NAEP), 36% of all fourth-graders were below the NAEP’s “basic achievement” score in reading. It sounds like a terrible record. But we know from the mathematics of the normal distribution that 36% of fourth-graders also have IQs lower than 95.

What IQ is necessary to give a child a reasonable chance to meet the NAEP’s basic achievement score? Remarkably, it appears that no one has tried to answer that question. We only know for sure that if the bar for basic achievement is meaningfully defined, some substantial proportion of students will be unable to meet it no matter how well they are taught. As it happens, the NAEP’s definition of basic achievement is said to be on the tough side. That substantial proportion of fourth-graders who cannot reasonably be expected to meet it could well be close to 36%.

The second problem with the argument that education can be vastly improved is the false assumption that educators already know how to educate everyone and that they just need to try harder–the assumption that prompted No Child Left Behind. We have never known how to educate everyone. The widely held image of a golden age of American education when teachers brooked no nonsense and all the children learned their three Rs is a myth. If we confine the discussion to children in the lower half of the intelligence distribution (education of the gifted is another story), the overall trend of the 20th century was one of slow, hard-won improvement. A detailed review of this evidence, never challenged with data, was also part of “The Bell Curve.”

This is not to say that American public schools cannot be improved. Many of them, especially in large cities, are dreadful. But even the best schools under the best conditions cannot repeal the limits on achievement set by limits on intelligence.

To say that even a perfect education system is not going to make much difference in the performance of children in the lower half of the distribution understandably grates. But the easy retorts do not work. It’s no use coming up with the example of a child who was getting Ds in school, met an inspiring teacher, and went on to become an astrophysicist. That is an underachievement story, not the story of someone at the 49th percentile of intelligence. It’s no use to cite the differences in test scores between public schools and private ones–for students in the bottom half of the distribution, the differences are real but modest. It’s no use to say that IQ scores can be wrong. I am not talking about scores on specific tests, but about a student’s underlying intellectual ability, g, whether or not it has been measured with a test. And it’s no use to say that there’s no such thing as g.

While concepts such as “emotional intelligence” and “multiple intelligences” have their uses, a century of psychometric evidence has been augmented over the last decade by a growing body of neuroscientific evidence. Like it or not, g exists, is grounded in the architecture and neural functioning of the brain, and is the raw material for academic performance. If you do not have a lot of g when you enter kindergarten, you are never going to have a lot of it. No change in the educational system will change that hard fact.

That says nothing about the quality of the lives that should be open to everyone across the range of ability. I am among the most emphatic of those who think that the importance of IQ in living a good life is vastly overrated. My point is just this: It is true that many social and economic problems are disproportionately found among people with little education, but the culprit for their educational deficit is often low intelligence. Refusing to come to grips with that reality has produced policies that have been ineffectual at best and damaging at worst.

(Nothing Follows)

Categories: Education, United States

>ABS releases annual report on social trends.

>Uninteresting results from the latest Australian Bureau of Statistics annual report on social trends, which was released today.

Australians are better educated, hard-drinking and childless car addicts with massive mortgages and a growing penchant for alternative medicines, according to a new statistical snapshot.

Which just goes to show that there are lies, damned lies and statistics.

I do not know one person who is well educated, drinks hard, has no kids, is addicted to his car, has a massive mortgage and trots off to Nimbin every few months to catch up on the latest alternative medicine trends.

The Australian Bureau of Statistics (ABS) annual report on social trends says the number of people aged 25 to 64 with a degree, diploma or certificate jumped from 46 to 59 per cent between 1996 and 2006.

What does that show? That people are achieving at a higher rate, or that standards have been lowered? I’ll give you some time to think about that…

But despite better qualifications, almost half of all Australians aged 15 to 74 had literacy skills below the minimum level required to “meet the complex demands of a knowledge society”, the report said.

Taaaadaaaa! An increased number of people with formal qualifications are in the market but almost half lack relevant literacy skills. The education unions and self-serving establishment must be jumping for joy at having achieved such terrific results. Not.

The report also looked into fertility, showing the number of women of peak child-bearing age without kids is on the rise.

In 2006, 37 per cent of women aged 30 to 34 had no children, increasing from 29 per cent in 1996.

Sounds like the Eurofication of Australia to me. Single women vote Labor in large numbers, and female Labor voters, like their left-wing counterparts the world over, breed at a lower rate than their conservative sisters.

The report found Australians are still lukewarm on public transport, with just one in five adults using public transport to get to work or study in 2006.

The only decent public transport in Australia is in Melbourne, especially if you live close to the city.

Three-quarters used cars as their main form of transport, while five per cent either walked or cycled.

Australia is a big place. People live a long way from work. You can’t expect them to sit on a train for two hours to get to work.

The report also looked into home ownership, finding the amount first home buyers borrowed to buy a house doubled in the 10 years to 2005-06.

The average loan per household rose to $213,000, as the average value of first-buyer homes rose to $310,000.

The proportion of first-home buyers purchasing new houses fell to 14 per cent, from 23 per cent.

The report gave further credence to claims the country is beset with a binge-drinking culture, finding about one in five men and one in six women aged 18 to 24 regularly drank risky amounts of alcohol in 2007.

What is a ‘risky amount’ of alcohol? Is it enough to be a health risk? Let’s find out…

While very few young people drank enough to be admitted to hospital, hospitalisation rates were up by 62 per cent for young men and doubled for young women in the seven years to 2005-06.

…I think that someone is trying to support the government’s insane attack on the non-problem of binge-drinking.

The number of people visiting a “complementary health professional” such as a chiropractor, naturopath or acupuncturist increased by 51 per cent in the decade to 2005, the report also found.

A 51% increase could mean that the number moved from 2 in 10,000 to 3 in 10,000. What’s the real number?

And for the first time more households are using broadband than dial-up to connect to the internet.

In the eight years to 2006-07, household internet connections jumped from 16 per cent to 64 per cent, with broadband accounting for between 50 and 75 per cent of connections.

The report also found trade union membership rates have almost halved overall in the two decades since 1986.

So it should. Trade unions have been a blight on the Australian economy for 50 years.

Nothing really too interesting in the figures but I’m sure that governments around Australia will find a way to use them to increase public ‘investment’ aka taxes.

(Nothing Follows)

Categories: Australia

>Scorching Heat vs Devastating Cold – the battle rages

July 21, 2008 2 comments

>The August 2007 edition of New Scientist includes the following article:

Prepare for another ten scorching years

Temperature records will be repeatedly shattered over the next few years, say researchers behind the first rigorous look at how global climate will change during the next decade.

The prediction comes from an innovative technique that combines the approaches used by weather forecasters, who typically look a few days ahead, and climate modellers, who produce projections that run up to the end of the century. The result is a model that can project as far as 2015, filling in a long-standing gap in climate predictions.

Although average global temperatures have been relatively flat in recent years, the model says they will start rising again next year. At least half of the years between 2009 and 2015 will exceed the current warmest year on record. By 2015, global temperatures will be 0.5 °C above the average value for the last 30 years.

“This is a very important paper,” says Rong Zhang, an oceanographer at the Geophysical Fluid Dynamics Laboratory in Princeton, New Jersey, who is using similar techniques to study the Atlantic Ocean. “This is just the beginning for this approach.”

The forecast is only possible because better figures are available on the state of the world’s oceans, says Doug Smith, a climate modeller who developed the predictions with colleagues at the Hadley Centre for Climate Prediction and Research in Exeter, UK.

A network of automated ocean-going devices, now numbering around 3,000, has been deployed around the planet since 1999. The devices, known as Argo floats, provide updates on ocean temperature and salinity – factors that are critical in determining global climate patterns.

Armed with the Argo results, Smith was able to create a climate model that started with an accurate representation of the world’s oceans. Without access to such data, traditional models had ignored the fine details of current climate. That meant the predictions they produced were only reliable for periods decades in the future, at which point the influence of variations in factors like ocean temperature will have been swamped by more powerful forces, such as greenhouse warming.

Temperature plateau

Smith’s approach seems to be working. Some of the figures published this week come from a trial of the model that was run in 2005. Comparisons with subsequent observations show that the model captured the recent plateau in global temperatures.

Such lulls could be used by climate change sceptics to argue that the world is not warming as predicted and that plans to cut greenhouse gas emissions are unnecessary, says Smith, so it is useful to be able to spot a brief pause in what is expected to be a steady increase. “There would be pressure not to mitigate emissions if we couldn’t predict a flattening,” he says.

Policy makers are also likely to benefit in other ways, says Zhang, since the approach used by Smith will soon be applied to regional climate models. That could eventually lead to better predictions of droughts and floods, events that are hard to predict even a few months in advance.

Electricity generators could use the model to predict demand, since energy use increases during very hot or cold spells. Smith says one UK electricity company is already buying his data and the results will also be considered by the project’s sponsors, the UK government’s Department for Environment, Food and Rural Affairs.

In summary, one year ago the prediction was that the world’s temperature was going to increase rapidly and the evidence comes from the study of oceans.

On the other hand, cooler heads (pardon the pun) are betting on the world entering a prolonged cooling phase.

Their evidence?

Studying the oceans.

Addressing the Washington Policymakers in Seattle, WA, Dr. Don Easterbrook said that shifting of the Pacific Decadal Oscillation (PDO) from its warm mode to its cool mode virtually assures global cooling for the next 25-30 years and means that the global warming of the past 30 years is over. The announcement by NASA that the (PDO) had shifted from its warm mode to its cool mode is right on schedule as predicted by past climate and PDO changes (Easterbrook, 2001, 2006, 2007) and is not an oddity superimposed upon and masking the predicted severe warming by the IPCC. This has significant implications for the future and indicates that the IPCC climate models were wrong in their prediction of global temperatures soaring 1F per decade for the rest of the century.

The cool water anomaly in the center of the image shows the lingering effect of the year-old La Niña. However, the much broader area of cooler-than-average water off the coast of North America from Alaska (top center) to the equator is a classic feature of the cool phase of the Pacific Decadal Oscillation (PDO). The cool waters wrap in a horseshoe shape around a core of warmer-than-average water. (In the warm phase, the pattern is reversed.) Unlike El Niño and La Niña, which may occur every 3 to 7 years and last from 6 to 18 months, the PDO can remain in the same phase for 20 to 30 years.

As shown by the historic pattern of PDOs over the past century and by corresponding global warming and cooling, the pattern is part of ongoing warm/cool cycles that last 25-30 years. Each time the PDO mode has shifted from warm to cool or cool to warm, the global climate has changed accordingly. In 1977, the PDO shifted from cool mode to warm mode and set off the global warming from 1977 to 1998, often referred to as the “Great Climate Shift.” The recent shift from PDO warm mode to cool mode is similar to the shift that occurred in the mid-1940s and resulted in 30 years of global cooling. The global warming from ~1915 to ~1945 was also brought on by a mode shift in the PDO. Every indication points to continuation of the PDO patterns of the past century and global cooling for the next 30 years. Thus, the global warming the Earth has experienced since 1977 appears to be over!

Add the PDO data to the fact that the sun has gone very quiet lately and it’s impossible to rule out the strong possibility that the world is going to enter a cooling phase.

So what’s it going to be?

Scorching heat or devastating cold?

I’d take the heat option any day of the week.

Cold kills. Warmth brings life.

Unfortunately for Australia, we’re about to inflict the greatest burden on our economy in history with the government’s emissions trading scheme – a burden that will make the fossil fuels required to heat our homes in a cooling environment much more expensive.

Who will that hurt? Those who can least afford it. As always.

That’s why people who proselytise the man-made global warming position are such immoral bastards.

(Nothing Follows)

Categories: Climate Change