Archive for July, 2008

Announcing the death of Climate Science as we know it

It’s fair to say that the science underpinning the proposition that mankind is responsible for the warming the planet has experienced over the last hundred or so years has copped a battering.

Less battered in the public arena are political operatives, the Climate Liars, such as the IPCC, Al Gore, James Hansen, the Hockey Stick team and Big Green environmental groups who currently hold public opinion in their grasp.

Europe is already suffering the economic pain inflicted on its people by the travesty of pandering to misanthropic green groups with the unilateral introduction of carbon trading laws.

While the extent and cause can be argued, it’s also clear that many developing nations are suffering the unintended consequence of higher food prices brought about by the use of ethanol as a fuel substitute.

Climate change policies have as their platform the output of a variety of climate models. As I’ve pointed out previously, these models have a zero percent prediction rate, which makes one wonder how history will view the politicians who used them to further their political goals when the consequences are so dire.

So far, climate science resembles astrology more closely than real science and climate models’ predictions are closer to horoscopes than anything meaningful.

If climate science is indeed real science then it should make predictions that can be tested. As Karl Popper tells us, the essence of science is falsifiability: a real theory makes predictions that could, in principle, be proven wrong. If a prediction fails when tested, the theory is wrong.

So, taking a purely scientific approach to the analysis of climate science, is there anything that kills its current predictions?

As it happens there is.

Before I go to that I’ll provide a similar example of a theory making a prediction that could be tested. A negative result would have killed the theory stone dead. A positive result didn’t make the theory fact but maintained the theory’s ongoing relevance.

That was the Big Bang Theory.

If you’ve read George Smoot’s book Wrinkles In Time then you’ll be familiar with the story.

In order for the big bang theory to be correct there must exist a background radiation signature that could be detected. Smoot, along with John Mather, led a team of scientists in the search for the signal. In 1989 NASA launched its Cosmic Background Explorer (COBE) satellite.

After more than two years of observation and analysis, the COBE research team announced on 23 April 1992 that the satellite had detected tiny fluctuations in the cosmic microwave background (CMB), a breakthrough in the study of the early universe.

Thus, a key prediction of the theory was shown to be true. For their work Smoot and Mather won the Nobel Prize in Physics in 2006.

In order for climate models to be correct – and thus the climate science upon which they are based to be correct – they must make predictions that can be tested.

The most important such prediction is that an increased greenhouse effect produces a hotspot in the atmosphere over the tropics.

In Dr David Evans’ recent paper The Missing Greenhouse Signature he states:

Each possible cause of global warming has a different pattern of where in the planet the warming occurs first and the most. The signature of an increased greenhouse effect is a hotspot about 10 km up in the atmosphere over the tropics.

We have been measuring the atmosphere for decades using radiosondes—weather balloons with thermometers that radio back the temperature as the balloon ascends through the atmosphere. They show no hotspot whatsoever.

So an increased greenhouse effect is not the cause of the recent global warming. So we now know for sure that carbon emissions are not a significant cause of the global warming.

To support this claim he provides the following evidence:

The theoretical signatures come from the latest big report from the IPCC, which is the most authoritative document for those who believe carbon emissions caused global warming. Figure 9.1 in Chapter 9 of the IPCC Assessment Report 4 (AR4), 2007, page 675, shows six greenhouse signature diagrams.

In each diagram the horizontal axis is the latitude, from the north pole (90 degrees north) through the equator to the south pole (90 degrees south). The vertical axis shows the height in the atmosphere, marked on the left-hand side as 0–30 km (and on the right-hand side as the corresponding air pressures in hPa). The coloured regions on each diagram show where the temperature changes occur for each possible cause (red +1°C, yellow +0.5°C, green −0.5°C, blue −1°C per century).
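As a purely illustrative mock-up of what such a latitude-height signature diagram contains (the Gaussian hotspot shape and its amplitude below are invented for the sketch, not taken from AR4):

```python
import math

# Synthetic latitude-height grid matching the axes described above:
# latitude 90N to 90S, height 0 to 30 km.
lats = list(range(-90, 91, 10))    # degrees; negative values are south
heights = list(range(0, 31, 2))    # km

def greenhouse_signature(lat, h):
    """Invented Gaussian 'hotspot' centred on the tropics at about 10 km up."""
    return 1.0 * math.exp(-(lat / 30) ** 2) * math.exp(-((h - 10) / 5) ** 2)

# Build the field the colours would encode (degrees C change per century).
field = {(lat, h): greenhouse_signature(lat, h)
         for lat in lats for h in heights}

# The warmest cell sits over the equator, about 10 km up: the "hotspot".
hottest = max(field, key=field.get)
print(hottest)  # (0, 10)
```

The real diagrams, of course, come from model output rather than a formula; the point of the mock-up is only the layout: latitude across, height up, warming encoded per cell.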

…The other main authoritative source for the case that carbon emissions caused global warming is the US Climate Change Science Program (CCSP). Atmospheric temperatures have been measured by radiosondes (at all heights) since the 1960s, and by satellites using microwave sensors (up to 5 km) since 1979. The CCSP published the results for 1979–1999 in part E of Figure 5.7 in section 5.5 on page 116:

The axes and colours are as per the signature diagrams above, except that the horizontal axis only goes from 75 degrees north to 75 degrees south, there is no data around 60 degrees south, the vertical axis only goes up to 24 km, and dark blue above becomes purple here. The data is called the “HadAT2 temperature data”.

This diagram is confirmed by more radiosonde data collected after 1999, and also after May 2006 when this diagram was published.

Evans concludes:

The theoretical combined signature expected by the IPCC contains a prominent and distinct hotspot over the tropics at 8–12 km. This hotspot is the signature feature of an increase in greenhouse warming.

The observed signature at 8–12 km up over the tropics does not contain a hotspot, not even a little one. Therefore:

  1. The IPCC theoretical signature is wrong. So the IPCC models are significantly wrong.
  2. The signature of increased greenhouse warming is missing. So the global warming from 1979 to 1999 was not predominantly due to increased greenhouse warming, and was therefore not due to carbon emissions.

The observed signature shows cooling above 16 km, which strongly suggests that the global warming was not due to increased solar irradiation, volcanoes, or increased industrial pollution (aerosols). The observed signature looks like a combination of increased ozone depletion, possibly a decrease in industrial pollution, and an unknown signature or signatures.

When the signature was found to be missing, alarmists objected that maybe the readings of the radiosonde thermometers might not be accurate and maybe the hotspot is there but went undetected. The uncertainties in temperature measurements from a radiosonde are indeed large enough for a single radiosonde to maybe miss the hotspot. Yet hundreds of radiosondes have given the same answers, so statistically it is not possible that they collectively failed to notice the hotspot.
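The statistical point here, that hundreds of instruments together can resolve a signal far smaller than any single instrument's error bar, follows from the standard error of the mean shrinking as 1/√n. A sketch with invented numbers (the trend, hotspot amplitude and per-sonde uncertainty below are all made up for illustration):

```python
import random
import statistics

random.seed(42)

TRUE_TREND = 0.0        # hypothetical: actual tropical upper-air trend (C/decade)
EXPECTED_HOTSPOT = 0.6  # hypothetical: trend a model hotspot would imply
SIGMA = 1.0             # hypothetical single-radiosonde uncertainty (C/decade)

def mean_trend(n_sondes):
    """Average the noisy trend estimates from n independent radiosondes."""
    readings = [random.gauss(TRUE_TREND, SIGMA) for _ in range(n_sondes)]
    return statistics.mean(readings), SIGMA / n_sondes ** 0.5

one_mean, one_se = mean_trend(1)
many_mean, many_se = mean_trend(400)

# A single sonde's error bar is larger than the hotspot itself...
print(f"1 sonde:    mean={one_mean:+.2f}, standard error={one_se:.2f}")
# ...but 400 sondes shrink the standard error by a factor of 20 (sqrt(400)),
# far below the 0.6 C/decade a hotspot would produce.
print(f"400 sondes: mean={many_mean:+.2f}, standard error={many_se:.2f}")
```

With 400 sondes the standard error is 0.05, so a 0.6 hotspot would stand out at twelve standard errors; that is the sense in which a collective miss is statistically implausible.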

Recently the alarmists have suggested we ignore the radiosonde thermometers and instead take the radiosonde wind measurements, apply a theory about wind shear, and run the results through their computers to estimate the temperatures. They then say that the results show that we cannot rule out the presence of a hotspot. If you believe that, you’ll believe anything.

Climate science makes a key prediction that can be tested.

When tested the prediction is shown to be false.

Therefore, climate science is shown to be false.

Climate Science is dead. Long may it rot in hell.

Long live real science.

(Nothing Follows)

Categories: Climate Change

If it wasn’t so serious then it’d be funny

If the issue of climate change was not as serious as it is then the ongoing destruction of the Climate House Of Cards would be hilarious.

I have said for ages that climate models have never been accurate. Demetris Koutsoyiannis has previously shown this to be the case and goes into even more detail about how bad the models are in his new paper, which is well worth a read.

The summary of his paper is:

Geographically distributed predictions of future climate, obtained through climate models, are widely used in hydrology and many other disciplines, typically without assessing their reliability. Here we compare the output of various models to temperature and precipitation observations from eight stations with long (over 100 years) records from around the globe. The results show that models perform poorly, even at a climatic (30-year) scale. Thus local model projections cannot be credible, whereas a common argument that models can perform better at larger spatial scales is unsupported.

Given the paucity of data supporting the proposition that CO2 is the main driver of climate change and the models’ complete inability to forecast accurately one would think that the issue would fade from public view.

Au contraire, mon ami.

There is one little problem with that conclusion.

Politics trumps science.

In the 1930s Stalin allowed the implementation of Trofim Lysenko’s disastrous agricultural policies. In James Hansen the 21st century has its own Lysenko.

It should come as no surprise that politicians use science for their own ends and, in the event that the science is falsified or doesn’t support the (typical) imposition of new taxes on the public at large, they simply do the three monkeys thing and ignore the inconvenience.

That’s why we’re in so much danger here in Australia. Both the Labor government and pissweak Coalition opposition favour the introduction of an emissions trading scheme.

For gawdsake – it snowed in Sydney the other day; the coldest Sydney day for over 40 years. Can’t these politicians stick their heads out the window for a brief moment to discover the truth? Are their minds so closed to scientific reality?

Apparently so.

The Australian population is yet to understand that the emissions trading scheme is simply another tax on them. Governments around the world are using business as a proxy for the collection of more tax dollars. Any cost increase to business inputs simply gets passed straight on to the consumer.

There is no difference to the average taxpayer between introducing an emissions trading scheme and increasing the GST from 10% to around 15% in the short term, and to 30%-40% in the longer term, in order to hit the targets outlined in the Garnaut Report.

By introducing the emissions trading scheme the government is telling Australian Working Families that it prefers to support China’s and India’s economic growth and standard of living over our own.

Western apologists are putting the argument that India and China need the chance to ‘catch up’ in terms of economic development and that, after all, it was the West that created the problem. This is certainly the position being taken by those countries and other developing nations such as Brazil.

The problem with that logic is that these countries had the same opportunity to grow their economies since 1950 but chose instead to implement the Marxist economic policies that impoverished all but the fortunate few at the top.

India and China invested no money at all into the development of the technologies that they are now able to take advantage of with their new found wealth. The West paid the carbon price for that.

Why not charge them a ‘catch up’ amount that would be in the same order as, say, a carbon emissions scheme?

Perhaps Australia’s Mandarin-speaking Prime Mandarin doesn’t want to get offside with those people for whom he obviously has more regard than he does for Australian Working Families.

As I said, if it wasn’t so serious then it’d be funny.

(Nothing Follows)

Categories: Climate Change

More on Peak Oil hooha

In the ‘a picture tells the story’ category come the following graphs, courtesy of the always informative Peak Oil Debunked.

Where’s the problem?

(Nothing Follows)

Categories: Energy

Climate data fudged, consensus breaking down. Politicians still ignore reality.

The great scandal of climate science is not only the manipulation of data by people like James Hansen but also the failure of the mainstream climate community to make its research data public, despite that data having been created with billions of dollars of the public’s money.

The Climate Fortress attitude is best summed up by Phil Jones’ response to an enquiry from Steve McIntyre:

We have 25 or so years invested in the work. Why should I make the data available to you, when your aim is to try and find something wrong with it. There is IPR to consider.

Intellectual property rights have something to do with the accuracy of science? Jones has had his snout in the public trough for 25 years and his data is not available?

Jones is afraid of someone finding something wrong with his data?

What sort of scientist is afraid of the truth?

A climate scientist, of course.

These people will be remembered along with Lysenko and Hwang as modern day scientific shysters.

The facade that is climate science is breaking down under an increasing barrage of analysis by scientists who are no longer afraid to speak out, as the UK Telegraph’s Christopher Booker highlights.

Considering that the measures recommended by the world’s politicians to combat global warming will cost tens of trillions of dollars and involve very drastic changes to our way of life, it might be thought wise to check the reliability of the evidence on which they base their belief that our planet is actually getting hotter.

There are four internationally recognised sources of data on world temperatures, but the one most often cited by supporters of global warming is that run by James Hansen of Nasa’s Goddard Institute for Space Studies (GISS).

Hansen has been for 20 years the world’s leading scientific advocate of global warming (and Al Gore’s closest ally). But in the past year a number of expert US scientists have been conducting a public investigation, through scientific blogs, which raises large question marks over the methods used to arrive at his figures.

First they noted the increasingly glaring discrepancy between the figures given by GISS, which show temperatures continuing to race upwards, and those given by the other three main data sources, which all show temperatures having fallen since 1998, dropping dramatically in the past year to levels around the average of the past 30 years.

Two sets of data, from satellites, go back to 1979: one produced by Dr Roy Spencer, formerly of Nasa, now at the University of Alabama, Huntsville, the other by Remote Sensing Systems. Their figures correspond closely with those produced by the Hadley Centre for Climate Studies of our own Met Office, based on global surface temperature readings.

Right out on their own, however, are the quite different figures produced by GISS which, strangely for a body sponsored by Nasa, rely not on satellites but on surface readings. Hansen’s latest graph shows temperatures rising since 1880, at accelerating speed in the past 10 years.

The other three all show a flattening out after 2001 and a marked downward plunge of 0.6 degrees Celsius in 2007/8, equivalent to almost all the net warming recorded in the 20th century. (For comparisons see “Is the Earth getting warmer, or colder?” by Steven Goddard on The Register website.)

Even more searching questions have been raised over Hansen’s figures by two expert blogs. One is Climate Audit, run by Steve McIntyre, the computer analyst who earlier exposed the notorious “hockey stick” graph that was shamelessly exploited by the Intergovernmental Panel on Climate Change and Al Gore. (This used a flawed computer model to suppress evidence that the world was hotter in the Middle Ages than today.) The other site is Watts Up With That, run by the meteorologist Anthony Watts.

It was McIntyre who last year forced Hansen to publish revised figures for US surface temperatures, to show that the hottest years of the 20th century were not in the 1990s, as Hansen had claimed, but in the 1930s. He has now shown that Hansen had been adjusting almost all his pre-1970 global temperature figures downwards, by as much as 0.5 degrees, and his post-1970 figures upwards.

Although Hansen claimed that this only resulted from more careful calculations, McIntyre pointed out how odd it was that the adjustments all seemed to confirm his thesis.

Watts meanwhile has also been conducting an exhaustive photographic survey of US surface weather stations, showing how temperature readings on more than half have been skewed upwards by siting thermometers where their readings are magnified by artificial heat-sources, such as asphalt car parks or air-conditioning systems.

All this has raised such doubts over the methodology behind the GISS data that informed observers are calling for it to be independently assessed. Hansen himself is notoriously impatient of any criticism of his methods: earlier this month he appealed to Congress that the leaders of those who question global warming should be put on trial.

It is still too early to suggest that the recent drop in temperatures shown by everyone but him is proof that global warming has stopped. But the fact is that not one of those vaunted computer models predicted what has happened to temperatures in recent years. Yet it is on those models (and Hansen’s alarmist figures) that our politicians are basing all their proposals for irrevocably changing our lives.

(Nothing Follows)

Categories: Climate Change

Sunday night rock ‘n’ roll

Jesus Jones is a British rock group, based in London, that recorded and performed in the late 1980s, throughout the 1990s, and into the 2000s.

Incorporating elements of electronic styles such as house and techno into an indie rock format, Jesus Jones, along with fellow British groups The Shamen, Pop Will Eat Itself and EMF, were one of the leading purveyors of the early 1990s “indie dance” scene. The band is led by Mike Edwards.

They achieved initial critical acclaim with their 1989 album Liquidizer, and in particular, the single “Info Freako”, which featured buzzing rock guitars with samples and a hip-hop sensibility, relatively new for the time. The track was particularly championed by Bruno Brookes on his Radio 1 evening show.

In the spring of 1990, Jesus Jones recorded their second album, Doubt, but their label was forced to delay its release until the beginning of 1991. The album sold very well, due to the success of their best-known hit “Right Here, Right Now”. The song, about the swift end of the Cold War, was a No. 2 hit in the U.S. but reached only No. 31 in the UK. It was resurrected in 2006 as an advertising jingle for the American retailer Kmart, in an image campaign for CBS News, and in promotional advertisements for the now defunct TV channel TechTV.

Other hit singles from the Doubt album included “Real, Real, Real” and “International Bright Young Thing”. In the year that Doubt was released, Jesus Jones won the “Best Newcomer” award at the MTV Awards.

The follow-up to Doubt was Perverse which, although a big seller, did not reach the worldwide hit status of Doubt.

International Bright Young Thing

Real, Real, Real

Right Here, Right Now

(Nothing Follows)

Categories: Music

The lie of blistering temperatures in Sydney, Melbourne and Adelaide

July 26, 2008

In an article published in The Australian it is claimed that extreme temperatures of more than 50C (122F) will become part of the climate landscape for the major Australian cities of Sydney, Melbourne and Adelaide.

To call this claim ‘scientific research’ is to misuse the term completely.

How does it come about that such extreme temperatures could be predicted?

Read the article first…

MELBOURNE, Adelaide and Sydney will blister in temperatures of more than 50C by 2050, according to the first hard look at the impact of climate change on extreme weather.

The forecast is part of a long-term prediction that temperatures on the hottest day of the year will rise dramatically in parts of southern Australia, including the southern Murray-Darling Basin, much of coastal NSW, Victoria and South Australia.

But the study did not find evidence that other parts of Australia would be so severely affected.

“No one’s ever looked at these numbers before,” said Andy Pitman, co-director of the University of NSW Climate Change Research Centre in Sydney.

Scientists with the CSIRO and the Australian Bureau of Meteorology have also assessed the nation’s future climate but they focused on average changes in extremes of temperature and rainfall due to climate change.

Along with graduate student Sarah Perkins, Professor Pitman analysed daily temperatures. “There is nothing wrong with what they did, but they missed that last bit of evidence that identified the ‘extreme’ extremes,” Professor Pitman said.

The researchers first tested the effectiveness of many climate modelling systems by “hind-casting”, testing how well they predicted past conditions.

After identifying the most reliable models, they simulated daily changes in temperature and rainfall as greenhouse gases increased in the atmosphere. They found the increase altered the pattern of warming for rare super-hot days.
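The hind-cast selection step described above can be sketched with made-up numbers. Everything below (the observation series, the toy "models" and their values) is invented purely to illustrate the idea of scoring models against the past and keeping the best:

```python
import math

# Hypothetical decadal temperature anomalies (C) for a hind-cast period.
observed = [0.10, 0.05, 0.18, 0.25, 0.31, 0.40]

# Three toy "models": each is just a list of hind-cast values for the period.
models = {
    "model_a": [0.12, 0.08, 0.15, 0.27, 0.30, 0.38],
    "model_b": [0.30, 0.25, 0.40, 0.45, 0.55, 0.60],
    "model_c": [0.00, 0.00, 0.05, 0.10, 0.12, 0.15],
}

def rmse(pred, obs):
    """Root-mean-square error between hind-cast and observation."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

# Rank the models by hind-cast skill; lowest RMSE is deemed "most reliable"
# and would then be run forward to produce the projection.
scores = {name: rmse(pred, observed) for name, pred in models.items()}
best = min(scores, key=scores.get)
print(best, round(scores[best], 3))
```

Note what the sketch makes plain: the selected model is the one that best fits the past, which says nothing by itself about its skill on the future.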

To their surprise, there was also an indirect effect. Global warming led to a reduction in rainfall which, in turn, reduced evaporation. “If there’s less evaporation, the land surface becomes hotter, a process known as positive feedback,” Professor Pitman said.

That is why extreme events in places such as Darwin and Perth did not outpace those in the south: there’s no feedback there.

…the research comes from analysing models and how they performed when hind-casting. Taking the most accurate models, as stated in the article, the models are then run forward in time to 2050 and, lo and behold, 50C is the answer.

Climate models are created by programming a number of algorithms related to well understood atmospheric physics. Past temperatures are then assessed against a model’s output and adjustments are made until the model represents the historical record accurately.

For example, while CO2 rose from 1940 to 1975 the world’s temperature fell. How did model makers overcome this inconvenience? By ascribing a cooling effect to particulate matter. What parameter was assigned to this cooling effect? Exactly enough to make their models correct. How convenient.
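A toy version of the tuning step just described, with every series invented for illustration: a free "cooling" coefficient k for particulates is grid-searched until the model best matches the historical record.

```python
# Hypothetical toy model: anomaly = a * CO2_forcing - k * aerosol_load,
# where k is the free cooling parameter tuned against the record.
co2_forcing  = [0.1, 0.2, 0.3, 0.4, 0.5]        # made-up forcing series
aerosol_load = [0.2, 0.4, 0.6, 0.8, 1.0]        # made-up particulate series
observed     = [0.0, -0.1, -0.15, -0.2, -0.25]  # made-up mid-century cooling

def misfit(k, a=1.0):
    """Sum of squared errors between the toy model and the 'observations'."""
    return sum((a * c - k * p - o) ** 2
               for c, p, o in zip(co2_forcing, aerosol_load, observed))

# Grid-search k until the model "represents the historical record":
# whatever value minimises the misfit is declared the aerosol cooling effect.
best_k = min((k / 100 for k in range(0, 201)), key=misfit)
print(round(best_k, 2))
```

The fitted k is exactly as large as it needs to be to cancel the mismatch, which is the point being made: the parameter is chosen to fit, not measured independently.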

Adjustments like this are going on every day in every model.

Now, if you take one of these models and run it to produce a forecast, its reliability must be extremely questionable, given that the fudges are statistically invalid.

Unsurprisingly, no model has ever accurately forecast the world’s climate.

For these ‘researchers’ to take the model that has the best fudges so that they can hind-cast and then rely on the output to claim that Sydney, Melbourne and Adelaide are in for 50C days can only be called one thing.

A lie.

Why a lie? Why not wrong?

Because these researchers know full well that models have no predictive ability and are created by a series of fudges.

Therefore, they knew what the result would be before they began their ‘research’.

The Climate Faithful have, in my view, moved from being simply wrong to being downright liars.

They ignore the mounting evidence that they’re wrong and are now screaming even more loudly that we’re all doomed.

The scientific battle is all but over. Unfortunately, the political battle still rages.

Let’s hope that in Australia we can avoid the destruction of our global competitiveness by the planned introduction of an emissions trading scheme.

(Nothing Follows)

Categories: Australia, Climate Change

Education and the upper 50% IQ

July 25, 2008

Charles Murray continues his look at why educational outcomes are not what society expects.

As I mentioned yesterday, how many politicians are going to take on the reality of IQ’s effect on education when they can take the easier option of tax and spend instead?

Many people believe that smaller class sizes equal better outcomes. This may be true in certain circumstances, but worldwide research shows that class size is a weak predictor of a student’s educational outcome. Teacher quality is the strongest predictor, as demonstrated by countries like Finland, which has large class sizes and requires its teachers to hold a master’s degree.

Smaller class size is pushed heavily by education unions, as it results in more teachers. The real effect is to lower teaching standards, as there are more teacher salaries to be paid out of a bucket that does not grow proportionally.

The other interesting thing to note is that IQ tests given to 8-year-olds – adjusted for age and so on – have a 75%-80% success rate at predicting high school outcomes, which supports the argument that Charles Murray makes:

The topic yesterday was education and children in the lower half of the intelligence distribution. Today I turn to the upper half, people with IQs of 100 or higher. Today’s simple truth is that far too many of them are going to four-year colleges.

Begin with those barely into the top half, those with average intelligence. To have an IQ of 100 means that a tough high-school course pushes you about as far as your academic talents will take you. If you are average in math ability, you may struggle with algebra and probably fail a calculus course. If you are average in verbal skills, you often misinterpret complex text and make errors in logic.

These are not devastating shortcomings. You are smart enough to engage in any of hundreds of occupations. You can acquire more knowledge if it is presented in a format commensurate with your intellectual skills. But a genuine college education in the arts and sciences begins where your skills leave off.

In engineering and most of the natural sciences, the demarcation between high-school material and college-level material is brutally obvious. If you cannot handle the math, you cannot pass the courses. In the humanities and social sciences, the demarcation is fuzzier. It is possible for someone with an IQ of 100 to sit in the lectures of Economics 1, read the textbook, and write answers in an examination book. But students who cannot follow complex arguments accurately are not really learning economics. They are taking away a mishmash of half-understood information and outright misunderstandings that probably leave them under the illusion that they know something they do not. (A depressing research literature documents one’s inability to recognize one’s own incompetence.) Traditionally and properly understood, a four-year college education teaches advanced analytic skills and information at a level that exceeds the intellectual capacity of most people.

There is no magic point at which a genuine college-level education becomes an option, but anything below an IQ of 110 is problematic. If you want to do well, you should have an IQ of 115 or higher. Put another way, it makes sense for only about 15% of the population, 25% if one stretches it, to get a college education. And yet more than 45% of recent high school graduates enroll in four-year colleges. Adjust that percentage to account for high-school dropouts, and more than 40% of all persons in their late teens are trying to go to a four-year college–enough people to absorb everyone down through an IQ of 104.

No data that I have been able to find tell us what proportion of those students really want four years of college-level courses, but it is safe to say that few people who are intellectually unqualified yearn for the experience, any more than someone who is athletically unqualified for a college varsity wants to have his shortcomings exposed at practice every day. They are in college to improve their chances of making a good living. What they really need is vocational training. But nobody will say so, because “vocational training” is second class. “College” is first class.

Large numbers of those who are intellectually qualified for college also do not yearn for four years of college-level courses. They go to college because their parents are paying for it and college is what children of their social class are supposed to do after they finish high school. They may have the ability to understand the material in Economics 1 but they do not want to. They, too, need to learn to make a living–and would do better in vocational training.

Combine those who are unqualified with those who are qualified but not interested, and some large proportion of students on today’s college campuses–probably a majority of them–are looking for something that the four-year college was not designed to provide. Once there, they create a demand for practical courses, taught at an intellectual level that can be handled by someone with a mildly above-average IQ and/or mild motivation. The nation’s colleges try to accommodate these new demands. But most of the practical specialties do not really require four years of training, and the best way to teach those specialties is not through a residential institution with the staff and infrastructure of a college. It amounts to a system that tries to turn out televisions on an assembly line that also makes pottery. It can be done, but it’s ridiculously inefficient.

Government policy contributes to the problem by making college scholarships and loans too easy to get, but its role is ancillary. The demand for college is market-driven, because a college degree does, in fact, open up access to jobs that are closed to people without one. The fault lies in the false premium that our culture has put on a college degree.

For a few occupations, a college degree still certifies a qualification. For example, employers appropriately treat a bachelor’s degree in engineering as a requirement for hiring engineers. But a bachelor’s degree in a field such as sociology, psychology, economics, history or literature certifies nothing. It is a screening device for employers. The college you got into says a lot about your ability, and that you stuck it out for four years says something about your perseverance. But the degree itself does not qualify the graduate for anything. There are better, faster and more efficient ways for young people to acquire credentials to provide to employers.

The good news is that market-driven systems eventually adapt to reality, and signs of change are visible. One glimpse of the future is offered by the nation’s two-year colleges. They are more honest than the four-year institutions about what their students want and provide courses that meet their needs more explicitly. Their time frame gives them a big advantage–two years is about right for learning many technical specialties, while four years is unnecessarily long.

Advances in technology are making the brick-and-mortar facility increasingly irrelevant. Research resources on the Internet will soon make the college library unnecessary. Lecture courses taught by first-rate professors are already available on CDs and DVDs for many subjects, and online methods to make courses interactive between professors and students are evolving. Advances in computer simulation are expanding the technical skills that can be taught without having to gather students together in a laboratory or shop. These and other developments are all still near the bottom of steep growth curves. The cost of effective training will fall for everyone who is willing to give up the trappings of a campus. As the cost of college continues to rise, the choice to give up those trappings will become easier.

A reality about the job market must eventually begin to affect the valuation of a college education: The spread of wealth at the top of American society has created an explosive increase in the demand for craftsmen. Finding a good lawyer or physician is easy. Finding a good carpenter, painter, electrician, plumber, glazier, mason–the list goes on and on–is difficult, and it is a seller’s market. Journeymen craftsmen routinely make incomes in the top half of the income distribution while master craftsmen can make six figures. They have work even in a soft economy. Their jobs cannot be outsourced to India. And the craftsman’s job provides wonderful intrinsic rewards that come from mastery of a challenging skill that produces tangible results. How many white-collar jobs provide nearly as much satisfaction?

Even if forgoing college becomes economically attractive, the social cachet of a college degree remains. That will erode only when large numbers of high-status, high-income people do not have a college degree and don’t care. The information technology industry is in the process of creating that class, with Bill Gates and Steve Jobs as exemplars. It will expand for the most natural of reasons: A college education need be no more important for many high-tech occupations than it is for NBA basketball players or cabinetmakers. Walk into Microsoft or Google with evidence that you are a brilliant hacker, and the job interviewer is not going to fret if you lack a college transcript. The ability to present an employer with evidence that you are good at something, without benefit of a college degree, will continue to increase, and so will the number of skills to which that evidence can be attached. Every time that happens, the false premium attached to the college degree will diminish.

Most students find college life to be lots of fun (apart from the boring classroom stuff), and that alone will keep the four-year institution overstocked for a long time. But, rightly understood, college is appropriate for a small minority of young adults–perhaps even a minority of the people who have IQs high enough that they could do college-level work if they wished. People who go to college are not better or worse people than anyone else; they are merely different in certain interests and abilities. That is the way college should be seen. There is reason to hope that eventually it will be.
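Murray's percentages line up with the conventional IQ scale (mean 100, standard deviation 15). A quick check of his arithmetic, purely illustrative:

```python
from statistics import NormalDist

# IQ is conventionally scored with mean 100 and standard deviation 15.
iq = NormalDist(mu=100, sigma=15)

# If roughly 40% of late teens head to four-year colleges, filling those
# seats strictly from the top of the distribution reaches down to about:
cutoff = iq.inv_cdf(1 - 0.40)
print(round(cutoff))  # roughly 104, matching "down through an IQ of 104"

# His suggested thresholds as shares of the population:
share_110_plus = 1 - iq.cdf(110)  # about a quarter ("25% if one stretches it")
share_115_plus = 1 - iq.cdf(115)  # about one in six (close to his "15%")
print(round(share_110_plus * 100), round(share_115_plus * 100))
```

So the figures in the excerpt are internally consistent with the standard normal model of IQ scores, whatever one makes of the argument built on them.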

(Nothing Follows)

Categories: Education, United States