In a world running out of oil, and where the fear of climate change would tell us not to emit the CO2 even if we had the oil, you might think the public would be clamouring for a crash programme to increase our use of the one proven high-capacity low-carbon energy generation technology currently available to us — nuclear power. There are many perceived, and perhaps a few real, shortcomings in existing nuclear technology that explain this lack of enthusiasm. First, people think nuclear power stations are dangerous, with their fears dominated by a Chernobyl-style core runaway leading to an explosion. Second, nuclear fission produces long-lived radioactive waste, the disposal of which is perceived to be a dangerous, risky and expensive business. Third, the widespread use of nuclear power could increase the risk of nuclear-weapons proliferation through the greater availability of enriched uranium and plutonium. And fourth, if we actually did try to generate a very large fraction of our worldwide energy usage from nuclear fission, we would run out of easily recoverable uranium quite quickly unless we moved to wide-scale use of some alternative to the current once-through method of burning nuclear fuel.
You can argue (the nuclear industry certainly would, and I would agree) that most of these fears are based on wildly exaggerated public perceptions of the real risks. From a lifetime of building particle physics experiments that must be kept as free as possible of radioactive contaminants, I know very well just how ubiquitous and highly variable radioactivity is in our environment. Uranium, thorium and potassium are everywhere, and all are radioactive. The odds are good that if you live in a brick house, your walls contain uranium and thorium at a sizeable fraction of a part per million. Despite this, people are terrified of getting a hypothetical enhanced radiation dose that might, at worst, amount to a tiny fraction of the difference in radiation exposure between one house on a block and the next. And this is in a country where almost a quarter of adults smoke cigarettes.
Another article (or whole book) could be written on the causes and effects of this irrational fear, but for the purposes of this article it doesn't really matter whether the fear is groundless. It matters only that it makes people very reluctant to use nuclear power, even if it is currently the only realistically implementable alternative to the risk of catastrophic climate change. That is why I have become very interested in an alternative form of nuclear power that could greatly reduce or eliminate the problems listed above. Called the accelerator-driven sub-critical reactor, or ADSR, it could turn out to be one of the most important technologies of the next 50 years. A small group of researchers in the UK called ThorEA (www.thorea.org), of which, to declare an interest, I am a member, is now exploring how the UK could leap into this technology and try to make it an important part of our energy mix (and, if it does turn out to be important, make sure that UK industry gets a piece of the pie and we don't have to buy it all from overseas).
The ADSR: Left to its own devices, it will turn itself off
In order to explain the benefits of an ADSR, I will first have to say a bit about how nuclear reactors work. Reactors derive their energy from neutron-induced fission, in which a neutron strikes a heavy nucleus and causes it to split into two lighter nuclei, releasing a large amount of energy and, crucially, a few more neutrons in the process. Only certain types of nuclei (called fissile nuclei) do this easily enough to be useful. Some examples are uranium-235 and plutonium-239. (The numbers indicate the total number of neutrons and protons in the nucleus; nuclei with the same number of protons but different numbers of neutrons are called isotopes — they have almost the same chemistry but very different nuclear properties, which is why it is useful, but extremely difficult, to separate fissile uranium-235 from non-fissile uranium-238.)
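The "few more neutrons" are what sustain a chain reaction, and the heading above hints at the key bookkeeping quantity: the effective multiplication factor, conventionally written k, the average number of neutrons from one fission generation that go on to cause fission in the next. The toy calculation below (my own illustrative sketch, not from any reactor code; the k values and starting population are arbitrary) shows why a sub-critical assembly, with k below 1, dies away on its own unless something keeps feeding it neutrons:

```python
# Generation-by-generation neutron bookkeeping for a chain reaction.
# k = effective multiplication factor: average number of neutrons from one
# fission generation that cause fission in the next generation.
# (k values and the starting population here are illustrative only.)

def neutron_population(k, n0=1_000_000, generations=50):
    """Return the neutron count in each generation for multiplication factor k."""
    counts = [float(n0)]
    for _ in range(generations):
        counts.append(counts[-1] * k)  # each generation is k times the last
    return counts

sub = neutron_population(k=0.97)    # sub-critical: population decays away
crit = neutron_population(k=1.00)   # exactly critical: population is steady
sup = neutron_population(k=1.03)    # super-critical: population grows

print(f"k=0.97 after 50 generations: {sub[-1]:,.0f} neutrons")
print(f"k=1.00 after 50 generations: {crit[-1]:,.0f} neutrons")
print(f"k=1.03 after 50 generations: {sup[-1]:,.0f} neutrons")
```

A conventional reactor is held at k = 1 by control rods; an ADSR is built with k deliberately below 1, so the chain dies out in a fraction of a second once the external accelerator-driven neutron supply is switched off — hence "left to its own devices, it will turn itself off".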