In a few weeks, after much delay, one of the most extraordinary machines ever built by humankind will be switched on. In a scene probably analogous to the inauguration of Stonehenge (which this device superficially resembles, in its circular form as well as probable function), assembled dignitaries and high-priests of science will witness the throwing of a (no doubt wholly symbolic) switch and a new era of particle physics will begin.

Then, about 300 feet beneath the Swiss-French border, in the shadow of the Alps near Geneva, in a tunnel roughly the length and diameter of the London Circle Line, small packets of protons will be whirled around at unimaginable velocities by magnets of unfathomable power, to be smashed into each other in a series of miniature, titanic cataclysms that will mimic, for just an iota of time, the flashes of Creation itself.

The Large Hadron Collider (LHC) — for that is the name of this gargantuan edifice — is nothing less than a cathedral of knowledge, perhaps the modern equivalent of the great stone observatories of the ancient world. Like Stonehenge, it was built using the resources of a people who for the most part will know little, understand less and care not at all about whatever mind-bending conclusions the scientists at CERN, the European Organization for Nuclear Research, derive from its workings. And this arguably doesn’t matter.

The LHC, which cost about €4 billion — or more, or less, depending on whom you believe — is one of those projects that divides the human race neatly into the large majority who tut and talk about money wasted, and the small minority who realise that, without machines like this, we are as nothing. For the Large Hadron Collider, whose job is to probe the fundamental structure of matter, the universe’s creation, the nature of mass and the truth about that elusive substance physicists call “dark matter”, is a symbol of what it means to be human — a triumph of our civilisation. That’s the good news; the bad news is that the mindset of which the LHC is a symbol and which has been in the ascendant for more than three centuries is now under threat.

To realise why, and why this is important, it is necessary to ask a rather profound question: what are we humans for? This, of course, seems a fatuous, trite and unanswerable question whose real answer is probably closer to “eat, reproduce and die” than anything glorious concerning God’s purpose or some grand design. Douglas Adams’s joke that the meaning of Life, the Universe and Everything is “forty-two” was funny because it summed up the futility of several millennia of navel-gazing.

And yet…. While the meaning of life may elude us, the purpose of life may not be so opaque. They are two different things. And while the purpose of humanity is, of course, open to debate, it is at least arguable that, if there is one, a clue to its nature is in the formal name we have given to our species, Homo sapiens, the “wise man”.

Some may consider this a misnomer; certainly we are not all worthy of the title. But as a species, we are certainly, if not wise, then very clever indeed. We may share 99 per cent of our DNA with our closest cousins on the tree of life, the chimpanzees, but in mental terms we might as well live in a different forest. The human brain is a quite unprecedented (in evolutionary terms) and extraordinary piece of kit, more complex than any other object in the universe. And if this brain has a purpose, then it is, surely, to discover as much as possible about the world. To date, the most effective way these brains have found of doing this is through the medium of science. That means spending money on telescopes and microscopes, spaceships and laboratories, student grants and huge machines like the Large Hadron Collider.

For science is, surely, the crowning triumph of our civilisation. The fire kindled in ancient Greece and Babylon, kept alive by medieval Islam and finally allowed to flourish under the beneficent aura of modern European Christianity, has now come to define our world. By “science” I do not mean any particular discoveries, exciting and ground-breaking though they may be. I do not mean particular people, or the technologies that may flow from scientific discovery, impressive and useful as they often are. It is more fundamental than that.

While great works of art can force us to rethink the way we see aspects of the world, only science has forced mankind to rethink the way it sees everything. Science says, “we don’t know, but maybe we can find out”. It is the ultimate deterrent against ignorance, and the antidote to intellectual fatalism: “it’s a mystery”, “it’s God’s will”, “it’s magic”. The benefits of scientific thinking — the rigour of the whole scientific method, encompassing as it does evidence-based decisions, experiment, replication, falsification, hypothesis and open debate — are so obvious and so far-reaching (applicable in the courtroom, parliament and even around the dining table as much as in the laboratory) that it is hard to believe that this greatest of mental tools could be under threat; but it is, and from three different directions.

First there is the growing belief in some countries, including Britain, that the purpose of science should be primarily utilitarian. This is a dangerous argument because it is so superficially seductive. Forget all that ivory-tower, blue-sky nonsense: go away to your labs and make us a new iPod or better toaster or more drugs. Poll after poll shows that the public demands that science be more “relevant”. Sadly this attitude of science-must-pay has now permeated the upper echelons of our current administration, and this can be seen in an extraordinary recent snafu that has left Britain’s scientists wild-eyed in disbelief.

The row in question concerned the Gemini Observatory and Britain’s contribution to it. Gemini, as the name suggests, consists of two huge identical telescopes, one in Hawaii and the other in Chile, run by an international consortium. A few months ago it was claimed that Britain was planning to renege on its share of the running costs of the telescopes. British scientists erupted with fury; the Gemini partners reacted with petulance, even unscrewing the Union flag from the facilities, and threw us out of the consortium.

Eventually things were sorted out — sort of. The official line is that it was a “misunderstanding”, that multilingual wires were crossed and that Britain never intended to withdraw from Gemini. Keith Mason, the controversial and, in some quarters, unpopular head of the Science and Technology Facilities Council (which allocates money for projects like this), says it was a storm in a teacup and a “non-story”.

Maybe. But the feeling persists that under the current regime “pure” research for research’s sake is going to have to take a back seat to “applied” research — i.e. research that may lead to more iPods, or medicines, or even just highly-trained brains that can be turned into hedge-funders and bank-wreckers. And while Britain may be back in Gemini, there is no doubt that the 25 per cent cuts in physics and astronomy funding that are coming into force right now will bite hard.

So what is science for? To advance the cause of human knowledge, surely. But people — even scientists — do not always see it like that. Here we come to the second threat: that of overspecialisation and the loss of the polymathic maverick, the thinker outside the box who might often be wrong, or mad, but may occasionally come up with relativity.

I would guess, for instance, that if you asked 100 professional biologists today where, when and how life on Earth began — surely among the most fundamental of scientific mysteries — the answers you would get would be little more informed than if you asked 100 physicists or even — dare I say it — 100 random people on the street. The reason is simple: as a biologist you get grants to come up with new drugs and new bugs, not to find the origin of life — which isn’t going to make any shareholders a penny. Even if your research is not commercially driven, by the time you enter the postgraduate world you will be working in a specialism of incredible narrowness, probably concerned with just one sort of protein or gene, or one species of microbe.

Things are particularly bad in the field of physics. Not many physicists admit this, but this most basic of disciplines has not seen a major breakthrough for a generation — a drought which, as the American physicist Lee Smolin argues, has no parallel since the 19th century.

The reason for this may again be partly to do with this ghettoisation and the way the peer-review system works. The dominance of the Big Thinkers, scientists capable of peering far beyond their professional horizons and getting paid to do so, is long over. If you asked Charles Darwin whether he was a biologist or a geologist or a meteorologist or indeed a physicist, he would look bemused and, once you had explained these modern terms to him, probably profess that he was all of them. Albert Einstein was not even a professional tenured physicist when he wrote his four seminal papers during his annus mirabilis of 1905. He was working in a patent office. It’s hard to imagine a young Darwin or Einstein getting grants today.

The great leaps forward seen in science from the days of Galileo until the 1960s were often motivated by maverick thinkers who eschewed the kind of ultra-specialisation that gets you funding these days — and certainly never had to worry about commercial tie-ins or formulating “economic impact” statements. It is probably no coincidence that some of the last great physics breakthroughs were made in the era of Richard Feynman, a wonderful polymath of the old school, a lover of the bongo drums and nude dancing bars as much as of quantum electrodynamics. And there is certainly something to be said for Lee Smolin’s thesis that the current cul-de-sac in which physics finds itself — an impasse centred on an almost impossibly arcane field called string theory which may not even be testable — is largely down to the way the funding is set up: you are either a string theorist or you won’t get a grant. Mavericks need not apply.

The last great threat to science comes from the woolly heads of the militant greens, the homeopaths and the rest of the new-age post-modern brigade who insist that “science” is to be accorded no more respect than, say, deconstructionism or Marxist literary analysis. The idea that science is just a matter of opinion, just another way of looking at the world, is dangerous and growing, among the Right as well as the Left and among the godless as much as the fundamentalist.


It is unlikely that these three threats will unite to undermine humanity’s greatest achievement. Science is too obviously “right” to be overthrown now. Moreover, it is the only belief system we have found that says it is happy to be proved wrong: science’s greatest strength. Still, we should be on our guard.

What are the great mysteries that remain? There are many, and most seem to be susceptible to attack by science. Is life common or vanishingly rare in the universe? What is dark matter and its sinister cousin dark energy? How can we reconcile quantum physics and relativity? Are the laws of physics arbitrary or are they the only laws that can be? It is possible that, by using machines like the LHC, the planned space telescopes and the appliance of the world’s finest minds, these questions may be answered in my lifetime.

That is why we must persist. That is why, despite the fact that the money saved could undoubtedly feed countless Africans (though it wouldn’t actually be used for this purpose), we must continue to build the Large Hadron Collider, the Hubble Space Telescope and the other cathedrals to knowledge that are our era’s towering intellectual achievements. For without this sort of thing, we really are nothing.
