The Absolute Need for Metaphysics

Where the West went wrong – and took the rest

of the world with it.

by Colin Tudge

Eastern observers of the western scene are wont to point out the sad demise of metaphysics. The idea of it – even the word – seems to have gone missing from western thinking over the past few hundred years. Certainly it’s not to be found, in word or deed, in the discourse of what passes as economics, or government, or law, or science, and certainly not in agriculture – yet everything we do and think about should be rooted in it. Indeed, the Professor of Islamic Studies at the George Washington University, Washington DC, Seyyed Hossein Nasr, suggests that all the present ills of humanity and the Earth might reasonably be traced to the loss of metaphysics from western thinking – since western ways of thinking now dominate the whole world.

For, says Professor Nasr, metaphysics deals with “ultimate questions” – which I reckon might roughly be summarized under four headings, as follows. (1) – What is the universe really like? What lies behind the appearances – what we can see and stub our toes on? (2) – How come? Why should things be as they are? Science has defined “the laws of physics” – but why are those laws the way they are, and how come they exist at all? (3) – How much can we really know? Not just “What are the limits of our knowledge?” but “How much are we really capable of finding out?” Finally, (4), in the light of the answers to all of the above (insofar as answers are possible) – How should we behave in this universe in which we find ourselves? What should our attitude be towards it, and our fellow creatures, and other people?

The standard western answers to all those questions, implicit in all you may read in The Wall Street Journal and The Financial Times, and espoused by politicians and by the hard-nosed scientists who draw their salaries from Monsanto and who dominate modern academe, are as follows:

(1): “Don’t be silly! What you see is what there is! Nothing lies behind the appearances, apart from the laws of physics, which we have already worked out (or soon will)!”

In other words, the standard western view is entirely materialistic. The universe and all that it contains are just “stuff”. Other animals are not our fellow creatures — or as St Francis said, our brothers and sisters. At best, in the modern parlance, they are “biodiversity”. They are part of “the environment”, which is the politicians’ name for real estate.

Many of those who take this view strongly imply that their thinking on this is ruthlessly hard-nosed and therefore unimpeachable. In fact they claim that “What you see is what there is” emerges from science itself. But of course it doesn’t. Out-and-out materialism is simply an exercise in denial: a straightforward refusal to contemplate seriously the possibility that there might be more to the universe than meets the eye. To make this refusal seem respectable, some sceptics try to root it in the principle of Occam’s razor: a paraphrase of the warning issued by the Franciscan friar William of Occam (or Ockham), who in the 14th century declared that we should never, when trying to find out what is true, drag in ad hoc notions that the data don’t seem to require. Thus the materialist argument goes – “We don’t have to suppose there is more to the universe than meets the eye, do we? So don’t do it!”

Yet there is plenty of reason to suppose that there is more to the universe than meets the eye – so much so, that most people through most of history have assumed that this must be the case. So Occam’s caveat does not apply. In any case, Occam never suggested that his adage was a royal road to truth. In reality, the modern, materialist scepticism is a hangover from the days of the logical positivists, who argued about 100 years ago that questions that cannot be answered definitively are “meaningless”. So they dismissed the very idea of metaphysics out of hand. Their modern successors simply declare that “What lies behind apparent reality?” is a non-question. But actually, it isn’t.

Question (2) – “How come?” – receives the same dusty answer. It cannot be answered definitively so it should not be asked, is the modern view.  In any case, if you want to know why the laws of physics are how they are – well, we now have the multiverse theory, and half a dozen others to choose from if you don’t like that one. Of course you might ask, “But where did the multiverse come from?” – but the hard-nosed, at this point, tend to get huffy, and declare the discussion closed.

The standard modern answer to question (3) – “What can we really know?” – is that “There is no theoretical limit”. Science alone can take us as far as can be gone. As the Professor of Chemistry at Oxford, Peter Atkins, once put the matter, “There is no reason to suppose that science cannot deal with every aspect of existence. Only the religious – among whom I include not only the prejudiced but also the underinformed – hope there is a dark corner to the universe, or of the universe of experience, that science can never hope to illuminate. But science has never encountered a barrier, and the only grounds for supposing that reductionism will fail are pessimism on the part of scientists and fear in the minds of the religious”.

In other words, science, all by itself, can lead us to omniscience, and with omniscience will come omnipotence. Indeed, we are already well on the way to both. Human omniscience and our pending omnipotence formed the subtext of Sir Nicholas Stern’s much-praised report in 2006 on climate change: we know what we need to do to control global warming, he said, and we have the technologies that can do it – it’s just a question of spending the money (so we must make lots of money first, even if we make things worse in the attempt). The belief in human omniscience and omnipotence, too, lies behind present-day biotech: the belief that we really do understand living organisms well enough to be able to re-fashion them at will (we just have to re-arrange the “genetic code”); and that we have the technology to do exactly what we want; and that we can anticipate all the possible hazards. Rich companies want to believe this, and some individuals within those companies apparently do believe it, and governments are more than happy to believe what rich people tell them, and scientists who should know better are content to take their salaries and keep their fingers crossed.

As for (4) – well: there simply is no such word as “ought”, the hard-heads tell us. “Morality” in the traditional sense – absolute right and wrong – is a dead concept. Charles Darwin told us (didn’t he?) that life is one long competition and our only role in life is simply to compete: to make sure that we survive, and reproduce, despite the opposition. The new genetics has told us (hasn’t it?) that we are all driven by our genes, and the genes that we now have are the ones that competed successfully in the past, so they can’t be bad. The modern economy is designed along Darwinian lines – which means along scientific lines, which means it can’t be wrong. Competition is the thing, above all, which in practice means that each and every one of us has to acquire as big a share as possible of the global pie. Do we have a right to do this? Again – don’t be silly! The universe and the creatures within it are just stuff, and don’t care, and neither does anything else because there is nothing else; and we are stuff too, and programmed to survive and spread our genes, so there is no good reason why we shouldn’t just grab as much as possible to help us do so. That’s logical. That’s science. Can’t be bad.

This hard-nosed view of life is rarely spelled out in such stark terms. Most politicians and even most scientists in public debate couch what they have to say in terms that can be made to sound moral, as in: “Well, the hard-nosed view of life may be unpalatable, but it’s true – and it is good to face up to the truth!”. Or, “Indeed it may seem as if the present economy is based on greed, but it’s important that we should all try to get rich, because you can’t do anything without money, good or bad, and the more money you have, the more good you can do!” Thus Mrs Thatcher once told the Elders of the Church of Scotland, to their evident bemusement, that this is what Jesus was trying to convey in the parable of the Good Samaritan – for (she said) the Samaritan would not have been able to help the wounded stranger if he hadn’t been rich, so it’s essential to become as rich as possible. If she were still compos she’d be holding up Bill Gates as a model for us all.

Why does it matter?

This modern western view, thoroughly leached of metaphysics, unpolluted by any inkling of anything that cannot be seen, touched, and given a price tag, leads us to a view of human progress that has three components:

Commodification. Since the world – and our fellow creatures! – are just “stuff”, and since our only role in life is to compete, we should regard the fabric of the Earth, and other species, purely as “resource”, and we should be striving to acquire as much resource and power as possible. Indeed, if we can get away with it – and we can, surprisingly often – we are fully justified, if we follow the logic of what now passes as Darwinism, in regarding other people as a “resource”, as in slavery, or what is now called “cheap labour” (which in truth can be cheaper than slavery). The point of all serious endeavour, therefore, and the root of all economies, is to turn the resources of the world, including living resources, into commodities – things that can be sold and turned into money.

Bureaucratization. Bureaucracy in essence is nothing more or less than a formal exercise in tidy-mindedness, a way of keeping track of who’s who and what’s what within the society – which is difficult when the society grows bigger than a tribe. All governments need their bureaucracies, or they wouldn’t even know who they were trying to govern; and the citizens need them too, or we could never have pension schemes, or hospitals that are more than apothecaries’ huts. So let’s hear it for bureaucrats.

But it has a downside. Too much bureaucracy reduces us all to form-fillers and box-tickers. Even worse, it is the natural and necessary agent of oppressive governments, as demonstrated in both Tsarist and Stalinist Russia – and, nowadays, albeit with a smilier face, in Britain. Civilized societies need to be tidy-minded but societies that are really worth living in also value individual freedom and individual creativity, and bureaucracy threatens both. Bureaucratization is nonetheless seen as a necessary goal of “development”. It is deemed unsafe simply to allow people to do their own thing.

Centralization. It is taken to be self-evident that all groups of people must be ruled, for their own good, otherwise they will tear each other to pieces as in William Golding’s Lord of the Flies. The rulers may often, in practice, more or less inherit their power over the rest of us. But the general route to power is again Darwinian (as Darwinism is now perceived). That is, the people rise to power who are best equipped to rule. So the politicians, corporate bosses, bankers, salaried scientists and other intellectuals who now dominate are there because they are the best people for the job. The rest of us might not always like everything that the top people do – sometimes our rulers have to take “tough decisions” – but the status quo nonetheless represents the best of all possible worlds. We need top people and the people at the top are the best people for the job. It follows that we should take every step to ensure that the top people do indeed have control. Ergo, power must be centralized.

This, then, is what progress means, according to the prevailing western view; and it is indeed what “civilization” is taken to mean. Countries that espouse this view of progress – commodification, bureaucratization, and centralization – are deemed to be “developed”. Those that seem to be moving towards such a view and the way of life that goes with it are said to be “developing”. Those that seek to pursue a different course are said to be “backward” or “laggard” (which I am told in Africa is a standard term) and if they protest too loudly they are said to be “rogue” states, or at least to be “failed” states that need taking in hand for their own good.

All of these ideas, which dominate the modern world, are junk, and most of them are vile. No wonder the world is in a mess. But what are the counter-arguments? What is the alternative?

The return of metaphysics

Those who espouse the modern western view and act upon it are absolutely sure, above all, that they must be right. Their ideas, after all, are founded in science, and can’t be wrong.

But actually, the modern view is not founded in science – or not, at least, when science is properly construed. If it were founded in science, and science were properly construed, it would not lead us to feel so certain, because science does not and cannot deal in certainty. In fact, the four basic questions of metaphysics can be answered in a quite different way – a way that is at least as deeply rooted and as justified as allegedly modern materialism. Thus:

(1): The notion that the universe and our fellow creatures are just “stuff” – “What you see is what there is” – is nothing more than an assertion. The alternative proposition – that there is more to the universe than meets the eye – is at least equally plausible, and indeed is what most people intuitively believe. The suggestion that there is nothing more is merely a denial which, as discussed above, has no respectable roots at all. Actually, hard-line materialism is simply a dogma. It is ironic that the hard-line materialists, who include many scientists, should so often so roundly condemn the dogma and doctrine of religion, when their own philosophy depends upon dogma absolutely.

(2): No definitive answer can be given to the question “How come?” We may indeed choose to give up on it on those grounds, as recommended by the logical positivists. But the fact that we choose to ignore some particular question surely does not mean that that question then ceases to have meaning, or to be important. If we cannot answer a question by science and by rationality alone, it seems to follow that if we really want to gain some insight into the ways of the universe, we have to move beyond science and rationality. This is what metaphysicians have been saying for the past 3000 years at least. (What they may have said before that is hard to know, since there aren’t many written records before that time – or not, at least, of a metaphysical nature). The extra-rational route to insight is via our intuitions. Our intuitions tell us a great deal, and there is no reason not to take them seriously. The role of religion is to cultivate intuition. The multiverse hypothesis and its stable-mates are exercises in arm-waving, and the fact that they are put forward by scientists does not mean they are “scientific” (unless, like some modern scientists, you define “science” as whatever a scientist says it is).

(3): “How much can we know?” Our knowledge is very definitely limited. Science can describe only aspects of the universe, and only provisionally, and in any case, “description” is not reality. It is a story about reality. Rationality, in which science is conventionally said to be grounded, is itself limited in what it can tell us. In the end, the universe is beyond our ken. Most philosophers from the past few thousand years, and especially most philosophers of science from the past 100 years, would not agree with Professor Atkins that science will one day tell us all. Science can look only at aspects of reality, and its conclusions are always provisional. Omniscience is not in our gift, and never will be, for reasons both practical and absolute. What we cannot exhaustively understand we cannot hope ultimately to control – so the dream of omnipotence is ludicrous.

If we don’t really know what we are doing (and we don’t) and we cannot hope for perfect control (which we can’t) then, surely, if only in the interests of our own survival, we should treat our little Earthly corner of the universe with extreme caution. Humility, in short, is the only sensible course – as well as the course recommended by most of the moralists whom the world takes seriously, including, for example, the Buddha, Jesus Christ, and the Prophet Mohammed. This is the very opposite of the arrogance that now prevails in our metaphysic-free world.

(4): What is it right to do? “There are no hard-and-fast rules”, is the modern, fashionable answer:  “There is no morality. There is only survival”. Hard-line atheists are wont simply to say that there can be no absolute moral laws because there is no-one to make those laws. Morality must be a purely human invention. Many go on from there to argue that moral codes differ from society to society, so they are all just convention. Enthusiasts for the supremacy of the global market, who nowadays seem to include most politicians in the world’s most powerful governments, tend to argue that morality, like everything else, should be defined by the market. Whatever people are prepared to pay for is ipso facto OK. I have even heard human cloning defended on these grounds. It is, after all, a marketing opportunity, or oppertoonidy, and therefore self-justifying. In fact, arguably, the only taboo that’s left in our market-dominated morality is paedophilia. Most people condemn it even though some people are prepared to pay for it.

Most people feel in their bones, though, that there are absolute or at least solid rules of morality. Indeed, the taboo on paedophilia shows that this is so. Even the materialists, who do not suppose there is more to life than meets the eye, feel that morality is somehow rooted in the way things are. But this raises a huge issue which belongs very firmly in the realm of metaphysics, rather than moral philosophy: “What is the relationship between the way the world is, and the way we ought to behave?”

Some people simply assume that what is natural is what is right – which is how the Enron CEO Jeff Skilling justified the way that he ran off with the investors’ loot, just before his company crashed. Darwin had shown that human beings are innately selfish, said Skilling (though actually he quoted, or to be fair misquoted, Richard Dawkins), and therefore he was only doing what came naturally, so that was OK. I have often heard the entire neoliberal economy and all its social destructiveness justified in just these terms.

But others insist that there is no relationship, or at least no logical relationship, between what is, and what ought to be. Most famously, David Hume said in the 18th century (although this is a paraphrase) “You cannot logically derive an ‘ought’ from an ‘is’”. In similar but not identical vein, G E Moore in the early 20th century spoke of the “naturalistic fallacy”. Quite obviously, a lot of things we would consider to be bad happen in nature, including rape and infanticide. Yet all is not so simple, as Cardinal Newman pointed out in the 19th century in response to Hume. He conceded that there is no logical path from “is” to “ought”. But, said Newman, a million lines of thinking and strands of evidence connect the two anyway, and those million strands create a firm connection just as the million threads of hemp combine to form a rope that can bind a battleship to the quayside. If Newman had been party to modern parlance he might have spoken of a “non-linear relationship” between what is and what ought to be. Certainly we are particularly repelled by acts that we feel in our bones are “unnatural”. A feeling in the bones might seem like a poor basis for making huge decisions – but in truth, in moral matters, our feelings must be the final arbiters. As David Hume said in a general context, “Reason is, and ought only to be the slave of the passions, and can never pretend to any other office than to serve and obey them.”

Then again, the idea that it is OK to be selfish and cruel because nature is selfish and cruel fails absolutely if, in fact, nature is not selfish and cruel. To be sure, Darwin emphasized competition; and Darwin was perhaps the greatest of all biologists and must be taken seriously. But as Darwin himself acknowledged (because he really was a good biologist) nature is not always competitive; and in any case, as many later biologists have pointed out, competition does not necessarily imply conflict or selfishness; and indeed, the more you look at nature, the more you see that, above all, it depends on collaboration. If nature is innately collaborative, then a morality based on nature would be most acceptable – indeed it would be what most people feel morality should be. Then if you choose to argue, as Cardinal Newman did (and the Catholic Church as a whole tends to argue), that what is natural is at least relevant to what is good, you would be on easy ground.

In short, whatever way you want to argue it, the simplistic modern notion that morality is just what we choose to say it is, and can safely be allowed to emerge from the market, is the most specious nonsense.

What difference does it make?

All the difference in the world, is the short answer. If we once perceived that the universe and our fellow creatures aren’t just “stuff” – or at least, that we can’t just assume that this is so – then, surely, we would not treat them so insouciantly. If we once acknowledged that life and the universe in the end are beyond our ken, and omnipotence is a dream, we surely would not attempt to take the world and our fellow creatures by the scruff and try to beat them into shape for our own convenience. If once we saw that it is natural to be collaborative and that competition is not the great driver of everything then perhaps we would more easily see through the rhetoric of the people who have assumed control, and begin to trust our own more kindly natures. If we once began to suspect that the materialism and self-confidence and sanctimoniousness that have overtaken the west are not what western civilization ought to mean then we would be less keen to impose this attitude on everybody else, and listen more carefully to their different points of view. All these notions spring from metaphysics. In short, if we reintroduced the concept of metaphysics, and took it seriously, the world could be a much better and safer place.

Other-worldly though this may seem – as indeed it is – I reckon this is a priority.

Beddington Re-Visited

WHY THE WORLD NEEDS SMALL MIXED FARMS

— AND PLENTY OF FARMERS TO RUN THEM

Supporters of Sir John Beddington’s The Future of Food and Farming claim that the report is all-embracing and even “holistic”, integrating all the various means by which food is produced into one grand pluralistic plan. In truth it is no such thing. Many possibilities are mentioned but mostly in footnotes, while the central narrative has two principal themes. First, the global, “free”, neoliberal market is taken as a given – ultimately each enterprise, each country, and indeed each farmer has to produce food more cheaply than anyone else in the world.  Secondly, humanity cannot possibly hope to pull through the next few decades and centuries without heavy reliance on high technologies. Cloning is featured – even though the report stresses the need for (bio)diversity. Even nano-technology gets a look in although its only foreseeable role is to make a few rich people richer. The neoliberal market and the accompanying technophilia lead us inexorably towards industrial farming – big, monocultural estates with minimum (preferably zero) labour. The report insists, time and again, that we cannot continue with business as usual – but in the main, business as usual is what it recommends.

Most farmers worldwide, plus those scientists and economists who have engaged seriously with agriculture these past few decades, have in general come to the quite opposite conclusion. Most of the best-informed now advocate very mixed, complex farms that veer towards organic. Such farms are innately complex and so must be labour-intensive: and if farms are mixed, integrated, complex, quasi-organic and labour-intensive there is no advantage in scale-up, so in general they should be small to medium-sized. Hans Herren, co-chair of the IAASTD (International Assessment of Agricultural Knowledge, Science, and Technology for Development), points out that such farms currently produce the bulk of the world’s food and, given half a chance, they could easily be far more productive than they are. They are also sustainable – given half a chance. But these, of course, are precisely the kind of farms that are now being side-lined or actively trashed.

This is the core idea of this Campaign – that the world needs and must encourage mixed, integrated, mainly organic, labour intensive farms that are small to medium-sized. The case is not based on nostalgia, or whimsy, or elitism, or any of the other standard insults that rain from on high. It is based on basic biological principle – and on statistical assessment of the status quo. This argument for small mixed farms in truth is evidence-based: the kind of argument that the powers-that-be claim to take most seriously. Here is a nice irony: the argument that supports industrial farming, which is supposed to be “scientific” and evidence-based, in truth ignores the most striking evidence. It isn’t the advocates of small mixed farms who are the simple-minded ideologues. It is the supporters of the status quo.

But the biological principles that lead us to the small, mixed, complex, labour-intensive organic farm have to be argued carefully if they are to convince; and a very accomplished ecologist has now pointed out to me that in recent articles I have been arguing the case too simplistically. For humanity needs farms that are productive, sustainable, and resilient – able to go on producing when conditions change – and I have been arguing of late that to achieve this we merely need to emulate nature. Nature is productive – largely because it is so diverse and integrated, so that one creature’s waste is another’s provender, and all the species between them mop up all the nutrients that are going. It is certainly sustainable – it has been continuously productive for the past 3.8 billion years; and this is possible because it is so frugal – it uses only the nutrients that are on hand, and solar energy. Clearly, too, nature is resilient, for through its vast continuous span the climate and much else besides have changed dramatically, back and forth, many times.

The agricultural equivalent of natural diversity is polyculture – mixed farming: many different crops and livestock. Integrated means integrated, as in nature. Frugal means organic: no artificial stimulants or pesticides. Mixed, integrated, largely organic farms are inevitably complex. Complexity requires hands-on husbandry, so the farms must be labour-intensive. If farms are complex and labour-intensive there is no advantage in scale-up so the farms that can actually do what the world needs should in general be small to medium-sized. It’s a grand argument – but in this rhetorical form it is open to criticism. It needs to be stated more carefully.

To begin with (my ecological adviser points out), nature, at least in the short term, is not always very productive. Total production of biomass in a given year (or a decade or a century) may be far less than in well-run farms. Of course, too, the point of farming is to produce good food, while most of the biomass from most wild ecosystems is not edible and certainly not palatable. Wild biomass is often non-nutritious (like wood) or frankly toxic (like the leaves of many plants, and perhaps most).

Less obviously but very much to the point, formal studies of wild ecosystems do not show a simple relationship between species diversity and long term “stability” – a concept that is hard to define but clearly overlaps with the concepts of sustainability and resilience. Many highly diverse ecosystems are extremely fragile, such as the fynbos of the South African uplands, which includes 500 species of the Ericaceae alone. Yet others that seem very poor in species – even “monocultural” – may persist for many thousands of years. Thus the boreal forests of Canada, which are practically the size of Europe, contain only nine species of tree, including six conifers and a couple of aspens, yet they have been there since the end of the last Ice Age.

Apparently, then, we cannot simply assert that species diversity per se solves our problems. Besides, industrial, high-input monocultural estates that are now economically de rigueur can produce more good food per hectare than small mixed farms often do. After all, the average yield of wheat in Britain is now 8 tonnes per hectare, with some of the fields of East Anglia far exceeding this; while a small mixed farmer in Africa is commonly content with a quarter of this (or its equivalent in other crops), or even less. It seems obvious, then, at least to the industrialists and their political supporters who now dominate the world, that the future lies with industrial farming (although they may sometimes concede these days that the small mixed farms may serve a stop-gap role, until the big guns can be mobilized).

But this industrial argument is highly simplistic too. Indeed it opens several horrendous cans of worms – a plethora of questions which (to the shame of the people in positions of influence) are for the most part not recognized as problems and so are not even addressed, let alone answered. Here are just a few.

First, stable but apparently simple ecosystems such as the boreal forest may seem to contain few species, yet most wild ecosystems are orders of magnitude more diverse than the monocultures of modern industrial farming. Although some of the forest trees are clonal up to a point (they spread by suckers, as the aspen does) the intra-specific genetic diversity is surely immense. An enormous range of species – insects, mites, fungi, vertebrates – live within the forest. Most of the species and genetic diversity is found in the soil. Notably, the boreal trees depend very heavily on mycorrhizae, which include several hundred species of fungi – and there is often a huge variety on any one tree. By contrast, industrial monocultures, whether clonal or not, have an extremely narrow genetic base; and the soil, after years of ploughing and other cultivations plus tonnes of artificial fertilizer and many kilos of pesticide, often with irrigation, is often virtually free of any biota. Industrial agriculture, in short, is commonly a field-scale exercise in hydroponics.

In short: the boreal forest is species-poor compared to tropical forest – but it is highly diverse by agricultural standards, and obviously is diverse enough to put up with the conditions it finds itself in. It may be simplistic to argue in general terms that diverse systems in the wild are necessarily more resilient than simpler ones. But it is simplistic in spades to suggest that agricultural monocultures are just as likely to be sustainable as complex ones, just because some wild ecosystems also seem relatively simple. Wild ecosystems are almost never as simple as agro-industrial monocultures – not by orders of magnitude. (I said “almost never” in the previous sentence because I wonder about bogs of sphagnum moss. How uniform are they? Does anyone know?).

On the other hand, a great deal of hard ecological data (or as “hard” as ecology can get) shows specific advantages in diversity. Thus no-one seriously doubts – do they? – that populations that are genetically uniform, or nearly so, are more vulnerable to epidemic. Pathogens will attack diverse populations right enough but they can rarely take more than a small proportion because each new individual host presents them with a new genetic challenge. No conservationist seriously doubts that many of the wild carnivores of Africa – cheetahs, wild dogs, many large populations of lions – are extremely vulnerable precisely because they lack genetic diversity (because they have all been through “genetic bottlenecks” in the past – sometimes several times). The late Bill Hamilton, one of the outstanding evolutionary biologists of the 20th century, argued that the need for short-term genetic diversity is the primary driver of sex. As a route to multiplication, sexual reproduction is extremely inefficient – but asexual reproduction leads to genetic uniformity, which leaves the descendants open to disease. There can be no doubt, either, not least from many classical examples, that domestic crops that are too uniform are always liable to fall to some parasite.

Thus, we can and must defend the need for genetic (and phenotypic) diversity in crops and livestock not in an arm-waving rhetorical way but specifically to protect against pests and pathogens. Monocultures have to be protected by industrial chemistry – as indeed is now “conventional”, using the technology developed primarily in World War II to create agents of chemical warfare.  Biotechnology, culminating in “genetic engineering”, is supposed to reduce the need for industrial chemistry but it partakes of the same mentality: seeking to protect monocultural crops with single genes that are the genetic equivalent of the chemical magic bullet.

Still, though, defenders of the status quo argue that a monoculture, protected by pesticide, is more productive than a diversity of crops, or of single crops that are genetically diverse; and further argue that this extra (hypothetical) yield is necessary to feed us all; and that the problems created by high tech industrialization (including the total reliance on oil) can in the future be overcome with more high technology. This is one of the cans of worms that needs to be explored exhaustively yet so far as I know has never been properly addressed (to the shame of those in charge of research, who do not seem to recognize the priorities). Suffice to say here that very uniform high-input crops may sometimes be more productive than more diverse crops – but, in general, only in the short term and in special conditions; and, for example, a recent study in Nature (April 7, 2011: vol 472, pp 86-89) showed that genetically diverse assemblages of algae grew faster in natural conditions than uniform populations – because nutrients are unevenly distributed in natural conditions and the various species between them were able to hoover them all up more efficiently.

On the farming front, long-term studies typically show that diversity wins out. The very highest recorded yields might perhaps be achieved by monocultural high-input farming (provided the conditions are absolutely right) but this really is not the point. The world does not actually need greater yield (another convenient myth perpetuated in high places for political-commercial reasons). Adequate yields with long-term resilience are far more important – and this, emphatically, is not achieved by high-input monocultures, for reasons of disease alone (not to mention dwindling oil and all the rest).

But there is, in this whole discussion, a serious lack of data. Some individuals have tried to find the necessary data but critical studies seem to be lacking. What data exist are mostly epidemiological – just looking at what is on the ground, with all the confounding variables that this entails. Since this is a key area, the lack of critical studies is a disgrace. Humanity as a whole really should be very angry that such important questions are neglected while taxpayers’ money is spent on high-tech “solutions” whose efficacy in large part is simply taken for granted, because it is politically convenient to do so.

It has also been argued that if we do need diversity, we can keep it “in the bottle”. That is, we can grow our monocultural, high-yielding crops for as long as they last, and then if the climate changes (as indeed is already the case) we can take another variety off the shelf, and Bob’s your uncle. This approach has been seriously mooted by people in positions of influence and seems to describe present industrial practice. But to argue thus is to ignore the realities of farming, and indeed of the physical world.

Thus, for every tonne of wheat (say) that is harvested, about one eighth of a tonne of seed (roughly two and a half hundredweight, in the old measure) needs to be planted. The weather can change dramatically from year to year – and in times of overall climate change, as now, the short-term fluctuations will surely be more severe (as we are already seeing). The entire scenario could change in a decade or less: areas that were suitable for wheat suddenly becoming fit only for maize; or traditional maize fields suddenly able only to support sorghum – or indeed semi-arid grassland; and so on. So we may find that we need to change the crop very quickly, but over millions of hectares. That means millions of tonnes of seed. So what use are these serried ranks of carefully labelled bottles? We would need warehouses full of each (and where are these warehouses?). It would take several decades to produce the quantities of seed-corn we need if we start with a few kilos.
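To get a feel for the scale, here is a rough back-of-envelope sketch in Python. The seed-to-harvest ratio comes from the text above; the area, the yield, the starting stock, and the little helper functions are illustrative assumptions of my own, not figures from this article or from any report.

    # Back-of-envelope sketch of the "seed in a bottle" problem.
    # The seed-to-harvest ratio comes from the text above; the area, yield,
    # and starting stock are illustrative assumptions, not data from the article.

    SEED_PER_TONNE_HARVESTED = 1 / 8   # tonnes of seed per tonne harvested (from the text)

    def seed_needed(hectares, yield_t_per_ha):
        """Tonnes of seed needed to sow a replacement crop over the given area,
        if that crop is to match the given yield."""
        return hectares * yield_t_per_ha * SEED_PER_TONNE_HARVESTED

    def seasons_to_bulk_up(start_tonnes, target_tonnes, factor=8):
        """Idealized lower bound on the seasons needed to multiply a small seed
        stock up to the target, assuming every grain harvested is kept and
        re-sown and each tonne sown returns 'factor' tonnes. Real seed
        programmes, with losses and quality control, are much slower."""
        seasons, stock = 0, start_tonnes
        while stock < target_tonnes:
            stock *= factor
            seasons += 1
        return seasons

    need = seed_needed(hectares=10_000_000, yield_t_per_ha=8)           # ten million hectares, assumed
    years = seasons_to_bulk_up(start_tonnes=0.005, target_tonnes=need)  # starting from 5 kg of stored seed
    print(f"Seed needed: {need:,.0f} tonnes; at least {years} seasons to bulk up from 5 kg")

Even on these wildly optimistic assumptions the answer comes out at around ten million tonnes of seed, and the best part of a dozen growing seasons just to multiply it up from a few kilos; with real-world losses, and the need to grow the seed somewhere suitable in the first place, it would take far longer.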

Furthermore, the idea that we can guess what variety we will need in the season to come, and can therefore select the right variety off the shelf, is obvious nonsense. All sorts of assumptions are embedded here, none of which are justified.

In the longer term, there is yet another, obvious and commonsensical argument in favour of diversity: it leaves us with more options. If conditions change (as they will) then diverse populations are more likely to contain individuals that can cope with the changes than populations that are more uniform. That is a simple point of logic, which seems to need no argument, and a small illustration is given below. But of course, even more fundamentally than Bill Hamilton, Charles Darwin argued that variation is the key ingredient of evolutionary change over time. Take away the genetic diversity, and we halt evolution in its tracks. That is a very dangerous thing to do at the best of times. In times of rapid change, as now, it is suicidal.
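For what it’s worth, the point can be put in simple numbers. The sketch below is illustrative only: it assumes, purely for the sake of argument, that each variety in a planting independently has some modest chance of coping with a new condition. On that assumption a mixture of many varieties is very unlikely to fail outright, whereas a monoculture stands or falls on a single throw.

    # Illustrative only: probability that at least one variety in a planting
    # copes with a changed condition, assuming each variety independently has
    # probability p of coping. The value of p is an arbitrary assumption.

    def chance_something_copes(p, n_varieties):
        """Probability that at least one of n varieties tolerates the new condition."""
        return 1 - (1 - p) ** n_varieties

    p = 0.1  # assumed chance that any one variety copes
    for n in (1, 5, 20):
        print(f"{n:>2} varieties: {chance_something_copes(p, n):.0%} chance something copes")

The independence assumption is of course too simple – closely related varieties tend to fail together – but the qualitative point stands: diversity buys options.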

For all kinds of reasons, the best way to maintain diversity is to keep it in the field. Crops – particularly cereal crops – need to be reasonably uniform or they cannot readily be harvested, and we don’t know what we’re getting when we plant them. But within that reasonable uniformity we can have enormous genetic diversity. In Suffolk, Professor Martin Wolfe, who is both a plant geneticist and an (organic) arable farmer, has of late been out-yielding his neighbouring (conventional) farm by growing mixtures of wheat varieties. He re-plants the mixtures each year – and thus, in effect, is imitating the strategy of traditional farmers, who develop local “landrace” varieties in situ, by planting mixtures of seed and re-planting, and allowing them to interbreed in the field. This, demonstrably, is good farming rooted in sound biology. It is the kind of approach that needs serious research and support. But, as is almost invariably the case these days, matters that cry out to be studied because of their potential value to humankind are left to individuals to work on, while the bulk of the world’s money, including taxpayers’ money, is spent on industrial approaches that are assumed, as a matter of dogma, to be superior (or at least lend themselves more readily to top-down control and the centralization of power).

In truth, then – as common sense and simple biology do indeed tell us – diversity in farming is vital: a diversity of crops and livestock, with genetic variation within each. The present fashion for cloning is yet another high-tech nine-day wonder, advocated by people who may be good at what they do but seem to have no broad understanding at all. These genetically diverse animals and plants must be integrated so that each benefits from the others and nothing is wasted – in absolute contrast to the modern industrial factory farms where, for example, manure becomes a burden: the prime source of fertility reduced to a pollutant. Organic farming should not, as now, be seen as a “niche”, but as the default position: what farmers do unless there is some very good reason for doing something else. The dogma from on high which says that organic farming cannot feed the world again seems simply to be untrue – but, although the issue is crucial, it is not being critically examined. Diverse, integrated, organic systems do need high standards of husbandry, so we do need plenty of farmers. Overall, simple logic rooted in basic biology leads us inexorably to the small to medium-sized, mixed farm – the kind which, as Hans Herren suggests, still provides most of the world’s food. But the industrial forces are gathering fast, with Africa now seen, in effect, by the world at large to be up for grabs.

As a very considerable bonus we might point out that agriculture is still the world’s biggest employer by far. Half the world’s population (around 3.5 billion people) still live in the countryside and most of them rely on farming for a living. The idea that the city provides a viable alternative is another grotesque piece of dogma. At present, an estimated one billion live in urban slums – getting on for a third of those that live in cities; and no country in the world, not even the richest, seems able to get on top of this. The problem is out of control, and this really should be acknowledged. By contrast, agrarian living, when properly supported, can be highly agreeable. The fashionable and convenient notion that agrarian life is always dour, and that city life is bound to be better, and represents “progress”, is yet another convenient myth perpetuated by those who benefit from the industrialization of farming, and from what is euphemistically called “urban development”.

We cannot afford to run the world on convenient myths. Reality says that industrial farming with all that it connotes is of very limited use, and that the future lies with farms that are small to medium-sized, mixed, labour-intensive, and more or less organic. This is where we should put our weight and our effort. Those who advocate high tech industrialization believe that they are “modern”, and that those who argue otherwise are victims of wishful thinking. The truth is the other way around. What now passes as “modern” agriculture and the neoliberal economy that supports it must be seen as aberrations. They have very definitely had their day and unless we wake up soon to this all too obvious fact, the world will be beyond rescue.

“GOOD FOOD FOR EVERYONE FOREVER: A people’s takeover of the world’s food supply”

My new book is now on sale!!!

The book summarizes the main ideas behind the Campaign for Real Farming and the College for Enlightened Agriculture: the ideas on which we intend to build. The text is a significant update of Feeding People is Easy.

Good Food for Everyone Forever is available from:

* The author: colin@colintudge.co.uk
* Bookshops

The book costs £9.99, or 13.50 euros; or £12.00 if you order from me directly (at colin@colintudge.co.uk), to cover p&p. I will send the books to you as soon as I receive your order, and will enclose the invoice.

This is the book of the Campaign, so please subscribe!

Eight Steps Back to the Land

Britain needs a new generation of farmers – and ten times as many as it has now. Colin Tudge suggests a route

Successive British governments these past thirty-something years have left British agriculture in disarray – and those of us who give a damn now have to pick up the pieces. We need nothing less than an agrarian renaissance and in practice only we, ordinary Joes, can bring this about: a people’s takeover. So where do we begin?

In truth, everything now needs doing. The infrastructure needed to make farming work well and in the public interest – not least the network of publicly owned scientific research stations, and Experimental Husbandry Farms (EHFs), and university departments and colleges of agriculture – has largely been closed down or privatized. That alone might reasonably be seen as the greatest act of government-inspired vandalism since the dissolution of the monasteries. But we have also lost an entire generation of farmers, so that we now have far too few – only about one per cent of the total workforce now work full time on the land, although for a strong agrarian base we surely need at least 10 per cent, and perhaps nearer 20 per cent; and the average age of the remaining few is now approaching 60. But then, until very recent months, some people highly placed in government actively wanted British farming to go the way of its mining industry. Two years ago I shared a public platform with a leading civil servant who argued just that (and had been knighted for his services to agriculture).

At the same time, of course, official policy has linked our agriculture – its practice and its economy – to the oil market, which was bad at the best of times and is now disastrous; and as the climate changes we are left with a system that has become less and less diverse, and less and less capable of adaptation; and the whole fate of farming is largely determined by the whim of the supermarkets, while the agrochemical, seed, and biotech companies, each with their own agenda, are huge players too. All these big commercial forces are supported directly or indirectly by the government, which is to say by public money, and by the law, which in practice is on the side of the big guys. As the coup de grace, the paper economy of the City, on which New Labour has pinned its faith, now looks very sick indeed. Only supreme optimists believe it can recover and if it did, it wouldn’t be for long, and the next relapse, in a world that by then will be even more depleted, could take us all with it. But alas! Those supreme optimists apparently include the government, whose only coherent policy in the face of collapse is to restore the status quo ante with all possible speed. Oh dear.

So our present position is very bad indeed. If our food supply wasn’t being propped up by foreign trade and imported labour we’d already be cap in hand to the United Nations and the International Monetary Fund, like many a beleaguered African country these past few decades. This is not a plea for little Britain-ism – far from it. A food trade conducted fairly and sensibly should be good for everyone. But the present approach – where rich countries use their political and economic muscle to encourage poorer countries to dedicate their fields to the needs of outsiders, while Britain throws its own farming to the winds – is grotesque. It does immense harm to the poorer exporting countries and leaves us in a precarious state indeed. Other countries have problems of their own and very few of them – any? – feel that they owe Britain any special favours. Indeed, such status as we now have in the world depends not on our present achievements but on our past glories: that we once had an Empire, and were on the winning side in both world wars and in the cold war. But the world after oil and in the throes of climate change is a whole new place.

But let us not dwell on negatives. Let’s just accept that our plight is now at least as desperate as it was after World War II and get on and do what needs to be done, just as we did then. The main difference lies with governance: for while the government of the late 1940s and ‘50s was far from perfect it was at least focused on the task in hand, while the collective head of the present government is somewhere in cloud cuckoo-land, along with that of the Tories. So let us also accept that this time round we, people at large, just have to do the job ourselves – starting with farming, which of all our enterprises is the one in most urgent need of repair.

This article does not attempt to deal with everything that agriculture needs. It addresses just one prime issue: how to enable and encourage people to get back on to the land; how to increase the present force of expert farmers at least tenfold; and how to do this without government help, and indeed in the face of policy and law that get in the way.

The general task is to create a career ladder – a route whereby people born and bred in cities who don’t know that potatoes don’t grow on trees or that milk comes from cows, come to realize that farming is what they always wanted to do, and then start doing it.  It would be pretentious even by my standards to point out that the Buddha defined an eight-fold path to enlightenment – but it happens serendipitously that in this context too we can envisage eight plausible steps that could achieve what’s needed. Thus:

STEP 1: A NEW GENERATION OF FARMING WANNABES

First we need a critical mass of wannabes: people (especially young people) who really do want to farm. In truth a critical mass does not have to be large. About 10 per cent of the whole will do – and I reckon that that 10 per cent already exists. It’s just a matter of identifying them.

Crucially, we need to raise the status of farmers – again to make it a respected, high-kudos pursuit. Necessity is a great spur, as shown in the siege economy of Cuba, where farmers and growers now rank with doctors in the social pecking order. Fashion is at least equally powerful, perhaps more so – as shown in Britain by the social rise of the chef, albeit inspired in large part by the celebs. In traditional France, as Jane Grigson was wont to recall, every small child wanted to own a restaurant – just as every small boy in Britain wanted to be an engine driver. Both jobs offered autonomy: the sense of being in charge of one’s own destiny, which in this alleged age of choice and freedom becomes increasingly rare.

Tony Blair’s great adage is relevant here – “Education, education, education”. As an innovative farming friend said to me recently – “When we show people round the farm the kids get the point immediately, while the adults tend to look blank”.  (And to paraphrase Max Planck, you can’t teach new ideas to old physicists. You just have to wait for them to die).

Once the wannabes exist, what can they actually do?

STEP 2: THE CONCERNED OBSERVER

A concerned observer could just be an informed consumer.  I hate the word “consumer” and the idea of the consumer society (we used to think we were citizens) – but consumers qua consumers are powerful nonetheless. It really is true that if no-one went to Tesco, then Tesco would disappear. It’s also absolutely the case that if enough people cared enough about food to buy good stuff from excellent farmers, then excellence would flourish – while at present best practice is punished by the mania for low prices. So a discerning consumer is not a bad thing to be. More generally, we need to resurrect the food culture, so that people take it for granted that good food is worth paying for. The Slow Food Movement is surely of supreme importance in this. It has become a major force in Italy and is catching on worldwide.

The next conceptual step is to become the kind of person who supports some version of CSA – “Community Supported Agriculture”. An outstanding example is the Stroud Community Agriculture project in Gloucestershire (again with excellent information on Google). Another excellent model – which we hope to be reporting on at length – is that of Tim Waygood at Agrarian Renaissance, based at Church Farm, in Hertfordshire (just type “agrarian renaissance” into Google). As soon as things can get organized, Tim intends to invite people at large to invest directly in Agrarian Renaissance with all kinds of benefits – a true business partnership with a strong social purpose. (Business can work for the public good. It doesn’t have to be an exercise in predation). My own outfit, currently called LandShare, is trying directly to influence events, and will need money sooner or later. In general, the present economy is not good, and we need a new one. But the point of Renaissance is to use existing resources – including existing cash – to create something better.

Anyway: the “concerned observer” can take many forms – and is a good and useful thing to be. It is of course highly regrettable that many people who would be concerned observers just don’t have time to be discerning. Or else they live in places where it’s Tesco or nothing.

STEP 3: THE ALLOTMENTEER

Get an allotment. Or grow vegetables and fruit in the garden. If you haven’t got a garden, grow things on a patio. Put up a small conservatory. Or use a balcony, or a window box. All schools should have school gardens, not just for vegetables but for plants in general. I was lured into this whole business largely by cacti, which in the mid-1950s were still rare and expensive (sometimes costing as much as three shillings and sixpence).

The point is not, actually, to grow all your own food. You can’t feed a family from a window box but it is still worthwhile. What matters is to get the hands dirty. All good experimental scientists, as opposed to pencil-pushing theoreticians, speak of the need for feel. In food above all, in this age of packaging and freezing, we need to recapture our sense of feel for what food is and how it grows. There is no understanding without feel.

Again, there are many movements afoot of many kinds to get people growing (not least the one developed on Channel 4 by Hugh Fearnley-Whittingstall, also confusingly called Landshare, though our own LandShare was established first).  Step 3, in fact, is well in hand – but needs building on!

STEP 4: HORTICULTURE-PLUS

At present, in most societies including Britain, horticulture is still seen as a nutritional side-line: a way of providing micro-nutrients, some fibre, and texture and flavour for aesthetic purposes. The burden of nourishment is left to the staple crops, grown on the arable (field) scale – cereals plus tubers (potatoes) and pulses (peas and beans); and livestock.

But the emphasis should shift. Many horticultural crops supply, or can supply, macro-nutrients too (energy and protein) – and some crops that are now generally grown on the arable scale began as horticultural crops and still play a huge part in gardening, or should do so again. Thus the small-scale grower can produce very useful quantities of potatoes and pulses. More interestingly, Oxford-based farmer and scientist John Letts (who I hope will be writing for this website) argues that cereals can and should be grown as allotment crops, as well as in the field. In all cases, of course, the gardener can focus on the rare varieties that often have special merits, while the conventional arable farmer feels obliged to focus on yield and uniformity (which is necessary for mass harvesting). In short, we should not reflexively assume that serious nutrition means arable and livestock. Horticulture has far more to offer than is commonly supposed.

One further refinement. Martin Wolfe, Suffolk farmer and scientist, argues that all farming should be seen as an exercise in agro-forestry – meaning that all farming should integrate trees not simply for adornment or as pheasant cover but as a serious component of the whole system. Again, this blog will be looking far more closely at Martin’s work (and try “Wakelyns Agroforestry” on Google). Indeed, one of the few encouraging signs worldwide is the apparently increasing emphasis on agroforestry in all regions – upland, lowland, tropical wet, tropical dry, temperate. Once growers start to integrate trees or shrubs in whatever form, they have crossed a barrier into a more intricate and integrated form of husbandry.

STEP 5: LIVESTOCK

There is no compulsion to expand from plants into livestock. Many professional farms and small-holdings are plants-only. But despite what the vegans tell us, there is no farm enterprise that could not be made more efficient and resilient by including a few animals of the right kinds, in the right numbers. Small-scale livestock – first poultry, then pigs, and then perhaps aquaculture (from tilapia to catfish to carp, depending on climate – and many more) work beautifully with horticulture; while the grazing animals – mainly sheep and cattle, the ruminants – integrate beautifully with arable and horticulture on the larger scale. With livestock, too, we see the real advantages of agroforestry – for most of the domestic species began as woodland animals (the sheep is the only possible exception) and all benefit enormously, and demonstrably, both from the shade and from the micronutrients offered by browse.

But livestock are a bigger commitment. You can let plants die with a fairly clear conscience but you can’t let animals die.

Again there are established routes in, showing how the great leap into hands-on farming can be made. One such is the farm run collectively by the villagers of Martin in Hampshire, masterminded by Nick Snelgar (and again there are many intriguing articles on all this on Google under “Nick Snelgar” and “future farms”). Again we will have more to say on Future Farms in the future (and it would be good to learn of comparable exercises elsewhere. The whole point is to spread the word).

STEP 6: FROM ALLOTMENTEER TO INFORMAL FARMER: THE ABSOLUTE IMPORTANCE OF MARKETING

Of course, anyone who grows things can decide to stay where they are any time they like. There is no compulsion to expand. But at least with a following wind, the allotmenteer who really gets the bug might grow by degrees until he or she is raising far too much for family and friends. If the expanded allotmenteer can only arrange to sell the surplus then, by any definition that is not simply legalistic, he or she is a bona fide farmer.

At this point, marketing becomes a key issue. Small-scale farmers cannot realistically sell to the supermarkets. Some supermarkets sometimes buy some stuff from small local farmers but this is not what their business is about, and such concessions in truth are window-dressing. The people’s takeover requires an alternative food chain. Farmers’ markets may provide a short-term option – indeed at present they are very important.  But in the long term they cannot be the answer.

The alternative food chain requires a whole separate study and will be a prime concern of this blog.

STEP 7: THE COMMITTED PART-TIMER

Once formal marketing begins, the enthusiast is de facto a part-time farmer. Part-time farming is, I reckon, the key to a great deal. Much of the best farming in the world has been, and is, done by part-timers. Part-time farming is a serious business. We can already see all kinds of precedents, from the hobby farms of Germany to the dachas of Russia to the crofters who account for most of Scotland’s agriculture (in terms of personnel and land), plus countless other examples from all continents. Google again – please try The Scottish Crofting Foundation. I like, in passing, the idea that some of the Founding Fathers of the United States were farmers, generally excellent farmers, when they weren’t being lawyers and politicians – and providing one of the greatest ever examples of what a people’s Renaissance can be.

Again, part-time farming needs to become a major theme of this website.

Again, too, there are various moves afoot to help interested parties to get moving. These include the FarmStep initiative at the Northmoor Trust in Oxfordshire, and LandScope, being set up by the Dartington Trust, in Devon (try Dartington Hall Trust — LandScope on Google). Again, we hope to write about both of these projects (and more).

STEP 8: THE FULL-TIME FARMER

The world will always need a core of full-time farmers. The key – which is a prime component of “the new agrarianism” – is to make full-time farming agreeable and desirable. This requires input at every level – practical, theoretical, social, legal, economic, administrative. This is one reason why absolutely everyone can be an active player.

Overall, the world has a crying need for a new economy that is not geared simply to cash and short-term profit, and is not designed simply to be maximally competitive. Such, indeed, is not the economy of serious people who want the world to be a better place. It is the natural arena of spivs and gangsters. But again, serious economists are on the case – and have been for decades. In particular, I hope we will refer again and again in this blog to the work of the New Economics Foundation, in London (which again is eminently Google-able). Again, although economic theory cannot be a major theme of this blog, we need at least to keep a weather eye on events.

The whole shift, from an obsessively urbanized and industrialized society to one that has a proper balance of urban and agrarian, and with an economy designed to secure that balance, may truly be called The Agrarian Renaissance.


A World View for Wild Law

The following is based on a talk to the United Kingdom Environmental Law Association (UKELA) workshop in the Lee Valley on 24th – 26th September 2010. The attitude to life that it seeks to define seems highly appropriate to the concept of Enlightened Agriculture.

Laws concentrate the mind wonderfully but they don’t really work – and perhaps they’re not even good laws – unless they reflect the zeitgeist – the spirit of the age; what Thomas Kuhn called the prevailing “paradigm”. If we really want wild law to work we need to work on the underlying paradigm – on how we view our fellow creatures, and on whether we feel in our depths that the Earth as a whole must be treated with reverence, or is a bonanza, a “resource”, to be turned into commodities with all possible speed, for the exclusive comfort and enrichment of human beings. At the moment, the latter view prevails.

Those who most firmly espouse the current paradigm seem supremely self-confident because, they feel, they can’t possibly be wrong. The paradigm is ultimately rigorous, no-nonsense. It is rooted in fact of the most solid kind – observation, made repeatedly, in this way and that, with the most refined instruments. Ideas – hypotheses – are floated to explain those facts and then those hypotheses are tested to breaking point, and then tested again, with every step subject to “peer review”. The whole procedure is inexorably logical and the logic is that of mathematics – absolutely precise, absolutely explicit.

Thus the truth about the world is discovered by method, and the method is ultimately “robust”. The method is, in fact, that of science; and science is seen as the ultimate manifestation, the apotheosis, of rationality. The conclusions reached by this rational process must surely be true – for what else can truth possibly be? Indeed, some suggest, this inexorable process of discovery is leading us towards omniscience; and omniscience will, in effect, make us omnipotent. We human beings – known rhetorically as “Man” – are masters of all we survey, and of our own destiny.  The triumphs of modern science, manifest not least in the “high” technologies that have emerged from our new, scientific appreciation of natural forces, show us beyond all reasonable doubt that the theories of science must be right.

Contrariwise, every other way of looking at the world is flawed: woolly and arbitrary; reliant on intuition and on emotional response; making claims and assumptions that are not rooted unambiguously in observation, and cannot be subjected to the logic of maths – and indeed cannot be rigorously investigated at all. Since science is perceived as the ultimate exercize in investigative rigour, it follows that every idea that is not rooted in science must fall short, and should be thrown out. Such ideas and the attitudes they give rise to are seen to be anachronistic; a vestige of our primitive past; notions that evolved in our hunting-gathering Pliocene-Pleistocene ancestors on the plains of Africa. Their function was not and is not to guide us toward deep truth, but simply to keep us alive from day to day. For day-to-day purposes, all kinds of woolly assumptions may serve us well enough. The point is to avoid hyaenas – not to understand them in any depth, or the universe as a whole. But for present, modern purposes – seven billion people bent on intellectual and material “progress” – untested assumptions will not do. We need rigour.

Out of this no-nonsense, self-confident view of the world has emerged, as night follows day, a whole host of notions that bear directly on all aspects of our lives – and, through us, on the lives of all creatures and the fabric of the Earth itself. Thus the modern worldview is entirely materialistic and mechanistic. In the 14th century, when clockwork itself was new, the universe was said (by some) to work like clockwork. Seventeenth-century natural philosophers, including Isaac Newton, suggested that “natural law” kept the universe on track. The 17th-century greats were all devout, and took it to be self-evident that there could be no laws without a law-maker, meaning God: but as the Enlightenment got underway in the 18th century, the law-maker began to seem superfluous, and atheism emerged as a respectable philosophy. Atheism is justified in large part by appealing to the principle of “Occam’s Razor”. This is the name given to the famous declaration from William of Occam (or Ockham) in the 14th century: “Non sunt multiplicanda entia praeter necessitatem” – “Entities are not to be multiplied beyond necessity”. In other words, our explanations of the universe and everything within it should be based on things we know (or think we know) – such as the known laws of physics. We shouldn’t drag in extraneous, hypothetical factors ad hoc to fill in the gaps. William was a Franciscan friar and took it to be self-evident that God Himself was a very necessary component of the universe. But modern atheists have recruited him to their cause and used his rule to boot out God as well.

Consciousness, closely related to the concept of mind, is another victim of materialist-atheist rigour, hacked away by the Occamist razor. People tend to think of mind as a phenomenon in its own right – Descartes saw mind and matter as the twin components of the universe. But in the materialist world there is only matter, and consciousness (as the American philosopher Daniel Dennett has explained at length) is just the noise that neurons make, neurons being the cells of which nerves and brains are made. In the materialist-mechanist view of things the brain is compared to a computer. Human minds are advanced computers, practising massively parallel processing. Computers can’t yet replicate the human mind, but (so the argument goes) they soon will. Human emotions are a way of setting the computer into a new mode. The sensation of emotion – what we feel – is a kind of illusion; how the brain registers the change of mode. The minds of other animals are similar in principle but much cruder. Through much of the 20th century animal psychology, at least of the kind that was considered scientific, was dominated by “behaviourism”. Everything an animal did (and indeed what humans did) was explained by variations on a theme of reflex and response, with reflexes compared to electric circuitry. The arch-champion of behaviourism, B F Skinner, even invoked behaviourist mechanisms to explain the way that humans learn language – he suggested that children are rewarded for using the appropriate words. Then Noam Chomsky pointed out, as any parent must surely have known, that the mechanisms Skinner proposed had nothing to do with reality.

It follows too from all this that the materialist-atheist view is entirely anthropocentric. There is no God to judge us. We ourselves are our only judges. We indeed are the only thinking creatures, and even we are just computers on legs. Furthermore, the universe is just stuff so we may as well just treat it as stuff.

Crucial to the modern paradigm is Charles Darwin’s idea of evolution by means of natural selection, dating from the mid 19th century. My own education is in biology and Darwin has always been one of my heroes. He was, after all, a fine gentleman, a family man, and a humanitarian who in maturity looked after the local villagers and as a young man out-faced Brazilian slave-owners (and his ship’s captain) during his voyage on The Beagle. He is also recognized as one of the greatest field biologists of all time; and the book in which he formally laid out his ideas on evolution, On the Origin of Species by Means of Natural Selection, belongs in a very short shortlist of books that have truly transformed human understanding, of just about everything. He is worthy to be anyone’s hero.

But Darwin was a human being, as all scientists are (although some try not to be), and like all human beings he was a child of his own time; and he grew up in turbulent and dangerous times that gave rise to enormous pessimism on all fronts. In particular, in the late 18th century, with repeats in the early 19th, the gloomy economist-cleric Thomas Robert Malthus (known cosily as Bob) predicted that the human species was bound to crash, since rising numbers were bound to outstrip the food supply. The early years of the industrial revolution brought rising urban misery, and the harvest failure of 1815-16, hard on the heels of the Napoleonic wars, seemed to confirm his prognostications. Tennyson in the 1830s wrote of “Nature red in tooth and claw”. At the same time, even before Victoria, orthodox Christian theology, and the certainty that went with it, began to lose its hold. Later, Matthew Arnold compared the decline of religious orthodoxy with the sound of the retreating sea: “… now I only hear / Its melancholy, long, withdrawing roar”.

Darwin grew up with all this. As a naturalist, he also perceived that Malthus’s misgivings about the human race must apply in principle to all living creatures: all tend to out-breed their resources. Therefore, he said, all creatures are obliged to compete for what there is – “the struggle for existence”. The ones that are best adapted to the prevailing conditions are most likely to win the competition and have offspring of their own. Hence “natural selection” – nature selects the ones that are best suited, best adapted, to their surroundings. As Herbert Spencer then put the matter, natural selection leads to “survival of the fittest” (where fittest has the Victorian meaning of “most apt”): an aphorism that Darwin later adopted.

Natural selection shapes lineages of creatures as the generations pass. This shaping, “descent with modification”, is evolution. It has made all creatures – including ourselves – what we are. Natural selection does not truly create but, by selecting from what is on offer, it acts creatively. Yet it is driven by competition. Life is perceived as one long punch-up from conception to the compost heap. Nature is indeed red in tooth and claw.

In the mid 20th century a small group of biologists in Britain and the US suggested that natural selection does not work primarily on individual creatures, as Darwin himself envisaged, but on individual genes. Each gene is engaged in perpetual struggle, both with other genes in the same genome, and with other genes in other individuals. By the mid 20th century it was known that genes are made of DNA. They are chemistry, in short; refined chemistry, but chemistry nonetheless. Their special property is that they can replicate. Lebensraum and resources are limited for genes just as they are for whole creatures, so there is competition between them. The genes that replicate most efficiently, survive. But individual genes, mere bits of DNA, are of course mindless, and have no sense of purpose. They just replicate – or not, depending on how things turn out. They are blind, dumb, and deaf, oblivious of all around them; or as Richard Dawkins memorably if deceptively put the matter, they are simply “selfish”. The Selfish Gene, published in 1976, has been one of the biggest best-sellers. Whole creatures, said Dawkins, including us, are mere “vehicles” for our selfish DNA.

Since the 1970s the notion has been further elaborated. Evolutionary psychologists have sought to explain all animal behaviour, human behaviour included, in genetic terms; some genes promote the kind of behaviour that enables their possessors to compete and to survive, and some do not. The reason why some genes are more likely to survive than others is analyzed in terms of game theory. Richard Dawkins has called this mode of analysis “ultradarwinian”. Here, surely, is ultimate understanding: human behaviour explained by chemistry and maths. Tony Blair tells us in his memoirs that he likes to “drill down” to the truth. Well, you can’t drill much deeper than this.

So we have the prevailing worldview: materialistic; mechanistic; ultimately “rationalist”; and atheistic. It is inspired by the general notion of natural law and by Darwin’s particular notion of natural selection – which is rooted in the concept of struggle and of competition; and all is now explained by the chemistry of DNA and by game theory. Overall, as Richard Dawkins has expressly informed us, the universe is bleak and indifferent, not to say pitiless, and we are just chemistry, like everything else.

This worldview in turn, inevitably, has spilled over into everyday life: our attitude to morality; to the economy; to each other and to the world as a whole. Our overall position is of course anthropocentric, because human beings are part of the great competitive process, and there is no reason in Heaven or Earth why we should acknowledge the rights or even the presence of anything else, except insofar as it contributes to our own survival. If we behave morally – which implies behaving unselfishly and indeed altruistically, helping others even at cost to ourselves, as opposed to acting merely out of enlightened self-interest – then this can only be because we choose, as thinking beings, to override the clamourings of our own genes: or, as earlier generations might have put the matter, to “rise above nature”.

Morality becomes an exercize in utilitarianism; and since everything nowadays is quantified, utilitarianism becomes an exercize in cost effectiveness.

The economy that emerges from all this is rooted, naturally, entirely in personal advantage, where personal advantage is measured in material gain. Economists conventionally take it as their premise that human beings benefit and are made happy by acquisition; and this is taken, in a simplistic way, to be a truth rooted in biology. More specifically, the ultimately competitive neoliberal market is seen to be essentially Darwinian – or indeed ultradarwinian. Philosophers and sages have been warning us at least since St Paul that behaviour that is natural isn’t necessarily right, morally – a sentiment most famously summarized by David Hume, who pointed out in the 18th century (although this is a paraphrase) that we cannot derive an “ought” from an “is”. G E Moore at the turn of the 20th century spoke of “the naturalistic fallacy”. In practice, though, we do tend to judge good behaviour in part by how natural we perceive it to be – certainly we condemn what we perceive to be “unnatural” behaviour. (Cardinal Newman pointed out in the mid 19th century that although Hume was right in principle – there is no logical connection between what is and what ought to be – in practice many different lines of thought lead us from one to the other.)

If, then, the world itself is naturally unsympathetic, competitive to the death, it seems perfectly reasonable to suggest that our present, neoliberal economy – based on uncompromising competition – is also perfectly natural, and so is perfectly acceptable, and indeed is right. Thus I remember an Enron director who ran off to Ohio or some such state with a billion dollars or so of the investors’ loot, claiming that this was fine because he had just read The Selfish Gene and learned that human beings, like everything else, are naturally selfish, and so it was OK. This is feeble logic, and not at all what Dawkins intended, but we can see nonetheless how the idea arose. Indeed I know an Oxford zoology DPhil who earns his living, or much of it, telling big business that the neoliberal way is the Darwinian way and therefore is nature’s way and therefore is OK.

Since this whole paradigm is rooted in undeniable observation, elaborated by thinking that is unimpeachably logical and indeed mathematical, it cannot possibly be wrong, or so its protagonists suppose. The universe is indifferent. We are machines, like everything else, driven by our innate impulsion to out-compete our fellows and to maximize our share of the resources, and the resources are the world at large and our fellow creatures. This may not be a cheering worldview, but we just have to get used to it. Truth in the end is best, no matter how unpalatable. Everything else is wishful thinking.

There are, however, good reasons for doubting whether the oh-so-confident worldview of the hard-liners is really as solid as it may seem. Here are a few of those reasons.

How much can science really tell us?

At first glance, the 20th century seemed to vindicate the hopes of the 18th-century Enlightenment: that rationality in general and science in particular can and in time will tell us all that there is to know. The matter was sewn up – or so it seemed – at the start of the century by the logical positivists. In essence (though this of course is a fairly vicious paraphrase) they said that statements that cannot be proved quite simply have no meaning. The only statements that can be proved, they suggested, are those that have to do with the material universe – theories that relate to things we can see and stub our toes on. This, they took to be obvious, is the stuff of science. Ergo, the statements of science have meaning, and all the rest is idle wool-gathering. This means in effect that science is the only road to truth, and is the sole arbiter of truth. Clearly the philosophy of logical positivism rested heavily on science; and many scientists in turn became logical positivists. Logical positivism formally died a death by about the 1970s, but those who espouse the modern, prevailing way of looking at the world are logical positivists, although they tend not to use the term. Instead they equate science with rationalism, and simply think of themselves as rationalists.

Various lines of thought knocked logical positivism off its perch. In particular, Karl Popper in the 1930s pointed out that no empirical statement about the universe can be proven beyond all possible doubt, or even beyond all reasonable doubt. All the theories of science are provisional statements, waiting to be knocked off their perch – either refuted, or else subsumed within some larger idea. The classic example is that of Newton’s mechanics, which seemed as though it could not possibly be wrong – and indeed, within its own domain, is surely not wrong. Yet, following Einstein, we can now see that Newton’s laws apply only to middle-sized objects moving at middling speeds. In particular, as we approach the speed of light, the rules change. More broadly, as J S Mill pointed out in the 19th century, no matter how much we know, there could always be things we simply haven’t thought of: what Donald Rumsfeld famously called “unknown unknowns”. Science, in short, does not and cannot provide the royal road to truth, and certainly not to the whole truth, and nothing but the truth.

Popper’s reservations were reinforced by Kurt Goedel (although Goedel historically preceded Popper). For the theories of science are rooted in the end in maths, which ever since Pythagoras has been seen to be unimpeachable. But Goedel showed that maths itself isn’t quite as robust as it seems. He proved (for the abstractions of maths can be proved) that any consistent mathematical system rich enough to include ordinary arithmetic must contain statements that are true but cannot be proved within that system.

The limitation of science was finally summarized by one of the mid-20th century’s great scientists, the zoologist Sir Peter Medawar. He simply adapted a phrase from Bismarck and pointed out that “Science is the art of the soluble”. In other words, scientists are careful to address only those questions they think they have a reasonable chance of answering (to their peers’ satisfaction) in the time and with the tools and concepts available. They provide certainty, or the semblance of it, only by carefully tailoring the questions. In short, the whole of science emerges as a giant tautology.

Finally, late 20th century philosophers began to point out that all human understanding is narrative: a story that we tell ourselves. What we take to be true is a story that we happen to find convincing at any one time. Science is a narrative too. It is not the inexorable edifice of incontrovertible truth, as it is commonly presented. It is more like a landscape painting, painted by a thousand hands, and never finished.

Science is wonderful and has become essential. Everyone should know science. But its insights, like all human insights, are strictly limited – partly by the limits of human understanding but partly, too, as a matter of strategy: science does not venture into areas that will not yield to its methods. Some seem content with the limited worldview that science can provide. Above all, they cling to the illusion of certainty. But most people are not content with a worldview that is so deliberately truncated. So where else can we look for insight?

The absolute importance of intuition

Broadly speaking, I reckon we can identify two routes to what we think of as truth. One indeed is rationality, as taken to an extreme by science. The other is intuition – things that we feel in our bones to be true. Hard-line materialist scientists reject intuition as a source of insight – but they are surely wrong to do so. In reality, scientists themselves rely very heavily (in the end absolutely) upon their intuition. Paul Dirac used to judge the truth or otherwise of his equations by their beauty; and in the end all scientists do something like this. Their rationalizing tells them which of the various hypotheses is statistically most likely to be true, but in the end their prehension of truth is an emotional response: what they take to be true is what they feel to be true. I feel that intuition is the sum total of our evolved responses to the problems posed by the universe – evolved, that is, in our human and pre-human ancestors, over many millions of years. Our intuition may sometimes lead us astray, to be sure, but overall there is no obvious reason to mistrust it. Yet intuition can be refined – and this, as many a theologian has pointed out, is the role of religion. Among both Christian and Islamic teachers we find a constant dialogue between intuition on the one hand and reason on the other. Wisdom requires both.

Conscious animals

One line of late 20th century thinking that is directly pertinent is that of animal consciousness. In the 17th century Rene Descartes peremptorily declared that animals cannot think because thought depends on words and animals don’t use words. Since they don’t think they cannot be conscious, and therefore they don’t really have feelings – for what are feelings if we are not conscious of them? Animals may yelp with pain but only in the way that a machine may protest if you push it beyond its limits. Logical positivism is in line with Descartes’ way of thinking – for it is essentially a minimalist view of reality: take seriously only what can be measured (with Occam’s metaphorical razor wielded like a machete). All these lines of thinking fed into the behaviourist agenda: animal psychology analysed entirely in terms of reflex; stimulus-response (a mechanical concept), modified – “conditioned” – by reward and punishment. As we have seen, the doyen of behaviourism, B F Skinner, even tried to explain the acquisition of human language in terms of conditioned reflex – children rewarded for using the right words.

Some, however, always doubted whether behaviourism really could explain animal behaviour satisfactorily – not the least being Konrad Lorenz, who spoke of the need to empathize with animals in order to understand them; and you can’t empathize with a machine. Noam Chomsky pointed out that humans certainly do not acquire language by learning new reflexes. Jane Goodall showed that the behaviour of wild chimps (as opposed to those who were merely required to solve puzzles set by scientists) cannot be explained except by supposing that they possess the kind of attributes that we think are peculiar to ourselves – including a broad palette of emotions and the ability to reason. As the Cambridge psychologist Pat Bateson commented in the 1980s – “Anthropomorphism can be heuristic”, meaning that we cannot properly understand animals unless we begin by ascribing human characteristics to them. New York psychologist Herb Terrace said that the task, now, is to understand how animals think even though they don’t have verbal language.

Since the 1980s scientists have begun to appreciate the subtleties of animals more and more. The Dutch biologist Frans de Waal in particular has written of the politics of chimpanzees. The Anglo-American primatologist Jennifer Scott has described “Machiavellian” behaviour in gorillas. Empathy between animals, and between animals and people, is now a respectable line of research, not least among biologists interested in animal welfare, such as Françoise Wemelsfelder of Edinburgh University. Empathy is an essential prerequisite of compassion.

People who work with animals, or have pets, or indeed just watch them in the fields, know in their bones that they reason and have feelings. It is the intuitive response to them. We have allowed ourselves to be talked out of this bone feeling by a trail of reasoning that seemed eminently rational – and allowed ourselves therefore to be talked out of the compassion towards animals that ought to come naturally. To be sure, the new insights into the psychology of animals come from science, just as behaviourism did. But they show us that our intuitions were right all along; and how dangerous it is to do as we have been bidden, and to override our intuitions with mere ratiocination.

The rise of virtue ethics

Morality, in the modern, mechanistic, ultradarwinian paradigm, has suffered horribly. Utilitarianism, alias consequentialism, has ruled; goodness and badness judged purely by outcome. Outcome in turn in this age of accountancy is quantified like everything else – so ethics has become an exercize in cost-effectiveness. Indeed, in the neoliberal economy that has sprung from the ultradarwinian, materialist worldview, morality is defined by the market itself. Whatever people are prepared to pay for, is deemed to be OK; and what they pay most for is deemed to be best. I have even heard human cloning justified on the grounds that there would be a “market” for it.

But alongside, in recent decades, we have also seen new interest in virtue ethics: morality judged according to attitude. Critics have often suggested that the great religions disagree on moral issues so they can’t all be right and they all deserve to be distrusted. But the differences between them are mostly those of manners and custom. At their core, as the 19th century Hindu mystic Ramakrishna pointed out, all the great religions agree on three fundamental ingredients of morality: personal humility; respect for fellow, sentient beings; and a sense of reverence for the universe as a whole. In one word, the Christians speak of love; and the Buddhists emphasize compassion. This, we intuitively feel, is indeed the core of morality; and reason tells us that it is very foolish indeed to override our intuition, especially when the intuition is so widely shared that it seems to be the human norm.

A truly modern paradigm and Wild Law

For a whole raft of reasons – reasons that are rooted very firmly in reason – we should mistrust what has become the prevailing paradigm: ultra-“rationalist”; scientistic; materialist; mechanist; atheist; the notion that human beings above all crave power and possessions, and are made happy by those things; a neoliberal economy based on personal acquisition and competitiveness – which is mistakenly taken to be “natural”; anthropocentrism; and a tendency to see the fabric of the Earth and our fellow creatures as resources, to be turned into commodities.

We need as a matter of urgency to embrace a more traditional worldview, rooted in intuition, though tempered by rationality; one that roots morality in virtue – and in particular in the virtues of humility, respect, and reverence; one in which we see other species truly in the way that St Francis saw them – as our fellow creatures; fellow, sentient beings with as much intrinsic right to be here as any of us.

If we had such an attitude we would not need Wild Law – or rather we would; but the point of that law (which in the end is the main point of laws in general) would simply be to make explicit what we all feel in our bones to be obvious.

In the meantime we need Wild Law as a heuristic device: as a way of focusing the mind on what ought to be obvious but has somehow gone missing.


Three Simple Stats that Change Everything

Here are three simple statistics that seem to me to put all of agriculture into a quite new perspective. They blow out of the water any suggestion that industrial agriculture is what feeds the world and that we need more and more of it, and they make complete nonsense of the present enthusiasm for “high” technologies such as GM and livestock cloning (and even, God save us, nanotechnology). All three stats are from reliable sources, although two of them need further clarification. So:

1: The world population will rise to 9.3 billion by 2050 and then stabilize. Numbers are likely to remain high for some decades or centuries after that but should then decline – not because of disaster but because that is the way demography works.

This means that the traditional Malthusian prediction – that human numbers are bound to go on rising until there is the most almighty crash – is just not true. Demography has moved on since the late 18th century, when Thomas Robert Malthus first made this grisly prediction.  This means that for the first time in 10,000 years the task of feeding people can be seen to be finite. Things will not go on getting worse.

All this is excellently discussed by Fred Pearce in Peoplequake, reviewed in this website under “Must Read”.

2: The total area of agricultural land worldwide is 4.9 billion hectares.

Many would dispute this figure, and much of the agricultural land is degraded. But it comes from Sir John Beddington’s latest report The Future of Food, put together by 400 experts over many months, so if it is not right, it ought to be.

This means that even when the world population reaches its peak (9.3 billion by 2050) there should still be more than half a hectare of agricultural land per person: two people per hectare.

Is it possible to feed two people per hectare? Well, the average wheat yield in Britain now is 8 tonnes per hectare – enough to provide 24 adults with all the energy and protein (ie the macronutrients) that they need; 12 times the average that’s required. Of course, many places are far less productive than the arable fields of Britain. But some places – particularly in the traditional, small mixed farms of SE Asia – provide far more food than Berkshire can, or even East Anglia – and provide all the vitamins and minerals as well. Even the beleaguered farmers of the Sahel, among the poorest and most neglected on Earth, realistically aim to produce about one tonne of sorghum per hectare – which  again is more than enough to feed two people. Beef and sheep on extensive grassland are likely to produce less than this – but any shortfall is more than made good by the land that is actually cultivated.
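For readers who like to see the sums spelled out, here is a minimal back-of-the-envelope sketch, written in Python purely for illustration. The figure of roughly a third of a tonne of grain per adult per year is the rough staple allowance spelled out in the Foresight discussion later on this site; treating Sahel sorghum as though it had the same food value as wheat is my own simplifying assumption.

```python
# Back-of-the-envelope check of the "two people per hectare" arithmetic.
# All figures are rough and purely illustrative.

AGRICULTURAL_LAND_HA = 4.9e9   # stat 2: total agricultural land, in hectares
PEAK_POPULATION = 9.3e9        # stat 1: projected peak population, around 2050
GRAIN_PER_ADULT_T = 1.0 / 3    # assumed staple allowance per adult per year, in tonnes
                               # (roughly a kilogram of grain a day)

def adults_fed_per_hectare(yield_t_per_ha: float) -> float:
    """Adults whose basic energy and protein one hectare's grain could cover for a year."""
    return yield_t_per_ha / GRAIN_PER_ADULT_T

print(f"Hectares per person at peak population: {AGRICULTURAL_LAND_HA / PEAK_POPULATION:.2f}")  # ~0.53
print(f"People to be fed per hectare:           {PEAK_POPULATION / AGRICULTURAL_LAND_HA:.1f}")  # ~1.9
print(f"British wheat at 8 t/ha could feed roughly {adults_fed_per_hectare(8):.0f} adults")     # ~24
print(f"Sahel sorghum at 1 t/ha could feed roughly {adults_fed_per_hectare(1):.0f} adults")     # ~3
```

On those rough numbers, the requirement of two people per hectare looks comfortably achievable – which is the whole point of the three stats.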

So what is supposed to be the problem?

3: Traditional farming – small, mixed, labour-intensive, low-input – provides 70 per cent of the world’s food.

This statistic is taken from Securing Future Food: towards ecological food provision, a UK Food Group Briefing (available online at www.ukfg.org.uk/ecological_food_provision.php). The figure is supported by a whole line of argument in Food Rebellions!, by Eric Holt-Gimenez and Raj Patel (Pambazuka Press, Foodfirst Books, and Grassroots International, Cape Town, 2009). This tells us, for example, that Africa now has 33 million small farms, mostly farmed by women, and mostly of less than two hectares – yet they are “responsible for 90% of the continent’s agricultural production”.

All the African farmers and serious European Africa hands that I have spoken to agree that traditional African farming is generally excellent, and all it needs is a helping hand; or, indeed, for the rest of the world to get out of the way.

In short, these simple stats show beyond any reasonable doubt that the task of feeding 9.3 billion people to a very high standard just should not be difficult – if that is what we seriously set out to do. The air of panic evident in The Future of Food is simply not justified – or rather it is, but only if we continue with the present strategy, which is to sweep aside the agriculture that now feeds 70 per cent of the world’s people and replace it with the kind that feeds 30 per cent, with huge collateral damage, including mass unemployment and environmental degradation.

As for GM, cloning, and the other ultra-high-tech gewgaws that are now so fashionable, they contribute almost nothing that is worthwhile: nothing that needs doing that could not be done better by other means. (They do, however, make rich companies even richer, and governments like those of Britain can then claim this rising wealth as “economic growth”. This, of course, is the whole thrust of the prevailing global agricultural strategy. Once we understand this, we see why the world is really in a mess.)

“THE FUTURE OF FOOD” equals MORE OF THE SAME!

Government Chief Scientist demands “a redesign of the whole food system” but recommends business as usual

There is much to admire in the Government Chief Scientist Sir John Beddington’s long-awaited Foresight report: The Future of Food and Farming: challenges and choices for global sustainability. So there should be. It has, after all, as Sir John tells us, involved “Several hundred experts and stakeholders from across the world … from a wide range of disciplines across the natural and social sciences”. It tells us, up front, and quite rightly, that we need “decision-making that is fully integrated across a diverse range of policy areas which are all too often considered in isolation, and for action to be based on sound evidence”. Indeed, “Nothing less is required than a redesign of the whole food system to bring sustainability to the fore”; and to achieve this, “Policy options should not be closed off. Throughout, the Project’s Final Report has argued the importance of, within reason, excluding as few as possible different policy options on a priori grounds”. Absolutely! Just what was needed!

But somehow, the report doesn’t quite live up to its own billing. It offers very few specifics – and where it does, these serve almost invariably to endorse the status quo. The message, in effect, is that although we may need an across-the-board re-think – a “redesign” – we are in essence already on the right lines and mainly need more of the same. There are some startling omissions along the way: entire, key areas of discussion have gone missing. There are internal contradictions, of which the editors seem unaware. A great deal is taken for granted that really must not be taken for granted – including the assumed authority of governments. Overall, the “wide range of disciplines” is not wide enough. Absolutely lacking is what is properly called metaphysics: any examination of what humanity’s attitude to the world as a whole ought to be. Our fellow creatures are summed up as “biodiversity”, the role of which is to furnish new genes for possible future crops, and to provide us with “ecosystem services”. This is a bureaucrat’s re-think.

Accurately, the report tells us that the world is in a truly disastrous state – rampant erosion, loss of fresh water (with agriculture gobbling up 70% of the world’s “blue water”), loss of oil and the quick and easy energy that it provides, dissipation of phosphorus (which the report doesn’t seem to mention), and above all, anthropogenic global warming – 30 per cent of which results from the food chain. The report rightly warns us not to expand our farming any further – in particular, we must not cut down any more rainforest – and says, therefore, that we need above all to intensify; yet we must do this without collateral damage. Overall, “Investment in research on modern technologies is essential in light of the magnitude of the challenges for food security in the coming decades.”

It all sounds perfectly sensible and indeed incontrovertible – but if we are truly being radical, and talking of “redesign”, then we need to re-think from first principles. If we do that, we can get a different picture.

What’s the problem?

As the report rightly tells us, the world population now stands at 7 billion and on present trends it will rise to about 9.3 billion by 2050 – but then the numbers should level out. So, as the  report points out,  our food problems can at last be seen to be finite – we need to feed 9 billion, and then go on doing so for a few decades or centuries, after which numbers should fall again, not through disaster but just because that’s the way demography works. This is the best news humanity has had since Thomas Robert Malthus told us in the early 19th century that humanity was bound to breed and breed until there is the most almighty crash. No we won’t, probably. The task is definable: to provide good food for nine billion.

Then the report tells us that the world as a whole has about 4.6 billion hectares of land that can reasonably be called agricultural.

But the authors fail to put the two statistics together. For if we have 4.6 billion hectares, and nine billion to feed, then the future task is to feed around two people per hectare – and that really shouldn’t be too difficult. In Britain, for example (just to provide a simple yardstick) the average wheat yield is now around 8 tonnes per hectare. One kilogram of wheat provides around 3000 kcals of food energy, at more than 10 per cent protein – which is more than enough energy and protein to feed an adult for a day. Since there are 365 days in a year, each person requires just over a third of a tonne of wheat  per annum (or the equivalent thereof) – so one hectare, producing 8 tonnes, could provide the macronutrients, the basis of a staple diet, for about 20 people. That’s ten times as much as the average that’s needed. [See the College website under “Statistics”]
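The same arithmetic can be written out step by step, as a minimal sketch (Python, purely illustrative). The 3,000 kcal per kilogram figure is the one used in the text; the round figure of about 2,700 kcal a day for an adult is my own assumption, and anything between roughly 2,500 and 3,000 gives much the same answer.

```python
# The chain of reasoning behind "one hectare of British wheat could feed about 20 people".
KCAL_PER_KG_WHEAT = 3000        # figure used in the text
KCAL_PER_ADULT_PER_DAY = 2700   # assumed round figure for an adult's daily energy need
WHEAT_YIELD_T_PER_HA = 8        # average British wheat yield cited in the text

kg_per_adult_per_day = KCAL_PER_ADULT_PER_DAY / KCAL_PER_KG_WHEAT       # ~0.9 kg: a kilogram is ample
tonnes_per_adult_per_year = kg_per_adult_per_day * 365 / 1000           # ~0.33 t: "just over a third of a tonne"
adults_per_hectare = WHEAT_YIELD_T_PER_HA / tonnes_per_adult_per_year   # ~24 on these round numbers

print(f"Grain per adult: ~{kg_per_adult_per_day:.2f} kg/day, ~{tonnes_per_adult_per_year:.2f} t/year")
print(f"One hectare at 8 t/year could cover the macronutrients of ~{adults_per_hectare:.0f} adults")
```

On these round numbers the answer comes out nearer 24 than 20; the more cautious “about 20” simply allows a rather more generous ration per head. Either way it is roughly ten times the two people per hectare that the land budget actually requires.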

Of course, the cosseted, supplemented wheatfields of Britain are far more productive than much of the world. Cattle and sheep, grazing and browsing in semi-desert, hardly produce a hundredth of this. But many systems worldwide are far more productive than Britain’s arable – including the traditional small mixed units of South-East Asia, where rice and horticulture are tightly integrated with fish, ducks, and pigs, and everything grows all year round. Even the arable farmers of the Sahel, who hope to produce about one tonne of sorghum per hectare, are producing enough. Overall, then, we ought to be able to say – “No panic!” The prime task, surely, is simply to encourage good farmers to farm, usually in the way that they do traditionally, and to make it possible for them to do so.  The secondary task is to find ways to enable farmers worldwide to do what they do with minimum collateral damage – but then if you look closely at traditional systems you find that many are wonderfully conservative and wildlife-friendly (as in the traditional mixed farms of South-East Asia); far more so than western high-tech systems. Traditional farming, in short, in all its extraordinary variety, is a good starting point; and where it fails, this is generally for extraneous reasons.

As we will see, the report does acknowledge the role of traditional farmers – whose status needs to be raised to where it once was: “In the African context, [farming] is often seen as old-fashioned, and the preoccupation of previous generations”. Yet the emphasis of the report is on the perceived need for what is taken to be modernity, with special reference to high tech. In fact the report is nothing like so open-minded and even-handed as it professes to be. Most of the time it is vague on specifics – leaving the powers-that-be very wide scope for doing more of what they do already under the broad heading of innovation – but in some areas, it is very specific. In particular it stresses the absolute importance, virtually the sanctity, of the present global economy: finance capitalism (money is all) within the context of the neoliberal free-but-rigged global market.

Neoliberalism rules, OK!

The report professes to dismiss no idea a priori – except the idea that the neoliberal, global market may not be serving the world’s people and our fellow creatures and the fabric of the Earth itself quite as well as it might. The editors stress the virtues of the market over and over. It is their principal leitmotiv. Thus we are told:

“Food security is best served by fair and fully functioning markets and not by policies to promote self-sufficiency”.

Then we are told (twice), “This Report rejects food self-sufficiency as a viable option for nations to contribute to global food security … it is important to avoid the introduction of export bans at time of food stress, something that almost certainly exacerbated the 2007 – 2008 food price spike.”

If anyone should step out of line, “Greater powers need to be given to international institutions to prevent trade restrictions at times of crisis.”

To be sure, “Concerns have been raised regarding the exercise of this concentration of corporate power, for example in retail markets and purchase contracts with suppliers (particularly smaller farmers) … However, there does not seem to be an argument for intervention to influence the number of companies in each area or how they operate – provided that the current numbers of major companies in each area and region of the food system were not to contract to a level where competition was threatened, and provided that all organisations adhere to high international standards of corporate governance.” How foolish to suppose that Tesco, and Cargill, and Monsanto might be gaining too much power!

But I don’t know anyone who has ever talked seriously about “self-sufficiency” in the context of national food security: except perhaps Mao Tse Tung, who felt (very reasonably) that China was beleaguered and so sought to cut the economy off altogether; and, possibly, these days, Cuba – which has also become isolated, although not, it seems, through choice. What people do talk about is self-reliance – a quite different concept. Self-sufficiency means that a country elects to produce absolutely everything its people need from within its own borders, eschewing all trade – and this is indeed a precarious and counter-productive strategy except in times of siege, military or economic, when the country has no alternative.

Self-reliance means simply that a country should elect to produce enough food to provide its own people with a basic diet, growing the staple crops that it is able to grow best. Of course all countries, ideally, should trade in food – both selling and importing. But none should be absolutely dependent on that trade to keep their people alive. Countries should not, as now, be obliged to buy staples from foreign powers to feed themselves – especially when the foreign supply fluctuates so violently in price, and the exporting country does not necessarily have the best interests of the importing country at heart. The literature drips with accounts of people worldwide brought to the point of starvation, and certainly of despair, because they can no longer produce or have access to their own food: because their own staple agriculture has been run down to make way for commodity crops to be sold abroad, for money which never gets back to the communities who have been robbed – and which would do them very little good if it did, because the imported food is too dear. We might point out (a little historical evidence) that at the time of the Irish potato famine of the 1840s, which halved the population (through starvation and emigration), the barns of Ireland were stuffed with oats. But the oats were contracted for English horses. Just as the present report recommends, the English government of the time, via its Anglo-Irish overlords, was careful to prevent “trade restrictions at times of crisis”. There is nothing new under the Sun, not even free trade, and the horrors of it have been revealed time and time again. Somehow, however, when the free market is at stake, the demand for “sound evidence” is put on hold. Never let the facts spoil a good dogma.

In truth, those who do question the wisdom and efficacy of the neoliberal market are liable to be written off as commies or greenies or hippies. But this in large part is straightforward nonsense. The present global economy, a frantic and ultra-competitive trade in money, with everything real conceived as a commodity, is an aberration; a perversion of what capitalism ought to be; and is hated by many a traditional businessperson, or indeed by many a traditional Tory and Republican, as vehemently as by any leftie. Quite rightly, those traditionalists see the present economy as a betrayal of what ought to have been a good idea – for capitalism within a framework of common sense and common morality could probably serve the world very well. What we have now obviously does not. But all this, the report leaves unquestioned. In truth, it does not so much dismiss the alternatives a priori as fail to consider them at all.

This brings us to the crux. Agriculture at present is pulled two ways – by the need to produce good food for all, without wrecking the rest of the world; and by the perceived need to preserve the economic and political status quo. The need to produce good food is real. The need to preserve the economic and political status quo is an invention, a dogma, conceived and defended to the death by people who are doing well out of it. If we are really to be radical, and to redesign, we have to explore this conflict head-on.

Biological necessity versus economic dogma

It should indeed be possible to provide good food for two people per hectare over all the world’s agricultural land, and to do so sustainably – without wrecking the rest; and resiliently – able to change direction when conditions change. Indeed this should be straightforward. But we have to treat the task as an exercize in applied biology, or indeed in ecology, and operate within the basic principles of biology. This, we are emphatically not doing – indeed the demands of the present economy pull us in the opposite direction. The thrust of present-day science-based (“high”) technology is not to enable us to work within the limits set by biology, but to override biology, up to and including the creation of quite new organisms. This may be profitable, or at least potentially so, but it is innately foolish. The objections do not spring from woolly-mindedness, or the irrational fears of “the public”, as the high technologists like to suggest. The best modern science – the kind that truly acknowledges how the world really works – shows how ridiculous it is. It also suggests a whole suite of alternatives. But these alternatives are routinely ignored or (as with the report’s conflation of self-reliance and self-sufficiency) are misrepresented.

For, to cut a long story short, there is plenty of reason – plenty of evidence – to suggest that the goals of productivity, sustainability, and resilience are best served by systems – wild or agricultural – that are very diverse and very integrated, with all the different species in synergy one with another. To minimize collateral damage and reduce the strain on the rest of the world (key components of sustainability), the systems should be minimum-input. As many of the necessary inputs as possible should originate within the system itself, with maximum re-cycling.

These are the broad biological principles that emerge from wild ecosystems – but they apply just as well to farms; for a farm is an artifice, but this too should be conceived as an ecosystem, if it is truly to combine productivity with sustainability. The diversity of nature translates into polyculture – mixed farming. The synergy of nature translates into integration – the traditional balance of crops and livestock, adult cattle and calves and sheep, with pigs and chickens to clear up, and so on. The arch-exponents of minimum-input farming are the organic farmers; and, contrary to the most widely bruited opinion, yields from organic holdings can be just as high as those of the industrialized kind, which rely on industrial chemistry (and hence on oil). In truth, we don’t have to advocate 100 per cent organic systems, with adherence to all the official rules. But it is sensible – sound biology – worldwide to see organic agriculture as the default position; what is normally done unless there is extremely good reason to do otherwise.

Maximally polycultural, integrated, quasi-organic farms are complex. Therefore they are labour-intensive – and the labour they require is skilled. In such systems there is no great advantage in scale-up, and many disadvantages. So sound, basic biology tells us that if we are really serious about the future – if we really want to feed 9 billion people well without wrecking the rest – then above all we need small mixed, quasi-organic farms. In structure, indeed, they should be traditional. In reality, traditional farms can often work badly, for all kinds of reasons, among which under-investment and lack of markets are outstanding. But they are still the norm worldwide, in a thousand different forms, and if they are properly supported – with ingenious technologies to reduce the back-break; with part-time  employment (of huge importance for many reasons); and with appropriate infrastructure – they really could feed the world. Even as things are, with the cards seriously stacked against them, such small farms still provide around 70 per cent of the world’s food. The much vaunted food industry, which attracts almost all the research funds and occupies so much government time, and gobbles up so much resource, and causes so much of the collateral damage, accounts for a mere 30 per cent of the whole output. But it’s the bit with the most powerful lobbies and the most wealth – precisely because it is expressly intended to make money – and so it’s the bit that the world takes notice of.

But agriculture that is designed primarily to maximize wealth and concentrate power must pursue a quite different logic. Output must be maximized – raise the turnover – and this is done by piling on the inputs, constrained only by their cost (for as the report points out, the collateral damage is commonly externalized). Costs must be reduced – which means that labour must be cut and cut again, for labour is the most expensive input; and such labour as there is should be as cheap as possible – not skilled workers but day-labourers (and preferably immigrants or the dispossessed, who have no rights and/or are too desperate to protest). Without skilled labour the husbandry must be as simple as possible – so bang go the complex, integrated systems. Instead we have monocultures as far as the eye can see – wheat, maize, palm oil, rapeseed, pigs in million-strong units (fed on soya grown in monocultures in the erstwhile rainforest of Brazil that we are not supposed to be felling) and even dairy cattle, in units of several thousand. The muck from these animals, which once was a vital asset (the main reason for keeping pigs, indeed), now becomes an embarrassment. It may be processed to provide energy and fertilizer in anaerobic digesters but in practice is generally a pollutant – and a million-strong pig unit produces as much ordure as London. Biologically this whole arrangement is grotesque. Socially it is foul. But it is profitable; and in a world where money is all, it is considered “efficient”, and therefore “realistic”, and is where big industry puts its money; and governments that measure their success in GDP see all this as the way of the future, and smooth the way.

At a common-sense and anecdotal level the biological advantages of small mixed farms over vast monocultures are now too obvious to be worth further discussion. But this is the key issue, and discussion is vital. If we go down the polycultural, quasi-organic route we could, beyond reasonable doubt, feed everybody well. If we continue down the industrialized route we could well be signing our own death warrant, and the rest of the world’s too. So it really matters. Here, truly, we need the data, the evidence, to show which kind of approach could serve us best.

But there is no such discussion in this report. To be fair, the report does mention small farmers: “Smallholder farming has been long neglected. It is not a single solution, but an important component of both hunger and poverty reduction.” But this comment is made only in passing: the implication is that of course the future lies with the industrial approach, within the global market. There is no recognition that small farmers are still (despite everything) the world’s chief providers. More to the point, there is no exploration at all of the key issue – polyculture versus monoculture. It isn’t even suggested (as far as I can see) that further elucidation should be a research priority.

There is a brief passage specifically on organic farming, in a box, which states the case for it very well: that it isn’t just about rules – it’s about farming in accord with sound ecological principles. So, the report concludes, “the wider application of specific practices will make a significant contribution to integrated and sustainable approaches to food production”. That’s not quite saying that organic farming should be the norm – what farmers should do unless there is very good reason for doing something else – but at least it is far from dismissive. The report also says that “The challenges as outlined here are so great that a flexible response involving all possible options based on the rigorous use of evidence is essential” – and that too seems eminently reasonable. It is the case, however, that although organic research has received some government support this past half century, the vast majority has gone to agro-industry, which nowadays is focused on biotech; and it seems most unlikely that we can expect any change of direction any time soon.

In a similar vein the report tells us that the food chain as a whole produces around 30 per cent of all the anthropogenic greenhouse gases, with agriculture accounting for a little more than half that, and “The single most important contribution of agriculture to greenhouse gas emissions is through the production and application of nitrogen fertilizers”. So it makes sense, does it not, to explore the kind of farming that doesn’t use nitrogen fertilizers? But this is not recommended. Furthermore – a very surprising figure! – “There is nearly as much carbon in the organic compounds contained in the top 30 cm of soil as there is in the entire atmosphere and a vast amount of carbon is tied up in land used for food production”. Indeed, “were the organic carbon pools in the world’s soils to be increased by 10% in the 21st century, it would be the equivalent of reducing atmospheric CO2 by 100 parts per million”. This is astounding. If it means what it seems to mean then this – increasing soil carbon – would surely make the greatest of all possible contributions to reducing global warming. In this context too, organic farming could, and surely should, be the key player. But the report is silent on this point. Instead it recommends that we look urgently at biotech and nanotechnology, and take pains to ensure that “the public” understands why these are so necessary. But I will come to that.
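The arithmetic behind that claim is worth checking, at least roughly. The sketch below is a back-of-envelope calculation, not taken from the report: it assumes a commonly cited round figure of about 2,300 gigatonnes of organic carbon in the world’s soils, and the standard conversion of roughly 2.13 gigatonnes of carbon per part-per-million of atmospheric CO2. Both numbers are my assumptions, used only to see whether the report’s figure is of the right order.

```python
# Back-of-envelope check of the report's soil-carbon claim.
# The two figures below are commonly cited round numbers, not taken from
# the report -- treat them as assumptions for illustration only.

SOIL_ORGANIC_CARBON_GT = 2300.0   # assumed global soil organic carbon pool, in GtC
GTC_PER_PPM_CO2 = 2.13            # approx. GtC corresponding to 1 ppm of atmospheric CO2

increase_fraction = 0.10          # the report's hypothetical 10% increase
extra_carbon_gt = SOIL_ORGANIC_CARBON_GT * increase_fraction
equivalent_ppm = extra_carbon_gt / GTC_PER_PPM_CO2

print(f"Extra carbon stored in soil: {extra_carbon_gt:.0f} GtC")
print(f"Equivalent drawdown of atmospheric CO2: ~{equivalent_ppm:.0f} ppm")
# Roughly 230 GtC, i.e. around 108 ppm -- the same order as the report's
# 100 ppm, so the claim is at least arithmetically plausible.
```

On those assumptions the report’s figure looks arithmetically plausible – which makes its silence on soil carbon all the more striking.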

More on carbon, first of all. For, the report tells us, “the second most significant is from livestock production through enteric fermentation and manure”; and “ruminants produce significant amounts of methane when compared with monogastrics”. Ruminants also produce more GHGs when grazing than they do when fed on grain – or so we are told. GHGs from livestock might be reduced by “breeding for reduced GHG emissions in beef and dairy cattle and via genetic improvements in their fodder, and the provision of high starch concentrates to reduce the production of methane in ruminants”. That is: feed wheat to cattle rather than grass – and more high tech.

There are various problems here. First, the report itself points out that change of land-use – which largely means ploughing – is a major source of GHGs; so the less we plough, the better. But arable, above all, requires ploughing, every year, while permanent pasture does not. Furthermore, the extra fertilizer needed to grow wheat to feed to cattle instead of grass is, as the report itself tells us, the prime agricultural source of GHGs, because oil is needed to produce artificial N. In short: a life-cycle analysis might well show that cattle fed on permanent grass produce far less GHG than those raised on wheat. There is also evidence – it needs fleshing out, but the roots of it are there – that well-managed grassland sequesters more carbon than it emits, despite the best efforts of belching cattle. The report stresses that solutions to the world’s food problems “will require decision-making that is fully integrated across a diverse range of policy areas which are all too often considered in isolation”. Yet the report itself is far from free of internal contradictions.
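To make the point concrete, here is a minimal sketch of the kind of accounting such a life-cycle comparison would have to do. It is not a model, and the numbers in it are arbitrary placeholders – not measurements from the report or anywhere else. The point is simply that the comparison must set enteric methane against fertilizer, ploughing and soil sequestration before any verdict on grass versus grain can be trusted.

```python
# Sketch of the accounting a life-cycle comparison would need.
# All values are arbitrary placeholders, in arbitrary CO2-equivalent units,
# purely to show the structure of the comparison -- they are not data.

def net_ghg(enteric_methane, fertiliser, ploughing, soil_sequestration):
    """Net greenhouse-gas balance of one cattle-raising system."""
    return enteric_methane + fertiliser + ploughing - soil_sequestration

# Grass-fed: more belched methane, but little fertiliser or ploughing,
# and possibly some carbon drawn into permanent pasture.
grass_fed = net_ghg(enteric_methane=1.2, fertiliser=0.1,
                    ploughing=0.0, soil_sequestration=0.4)

# Grain-fed: less methane per animal, but the wheat must be fertilised
# with artificial N and the arable land ploughed every year.
grain_fed = net_ghg(enteric_methane=0.8, fertiliser=0.6,
                    ploughing=0.3, soil_sequestration=0.0)

print(f"grass-fed net: {grass_fed:.1f}  grain-fed net: {grain_fed:.1f}  (arbitrary units)")
```

Which system comes out lower depends entirely on the real values of those terms; the complaint here is that the report tots up only the methane.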

But although the report seems to have far too little to tell us about the possible biological solutions to the world’s food problems, it does have a great deal to say about the new technologies.

High tech?

We can all agree in principle that “New technologies (such as the genetic modification of living organisms and the use of cloned livestock and nanotechnology) should not be excluded a priori …” Indeed, nothing should be dismissed a priori. We might, however, object to the way that sentence finishes – “ … excluded a priori on ethical or moral grounds”. Actually, to dismiss a particular technology “on moral grounds” is not to dismiss it a priori. It is to dismiss it on moral grounds. Moral grounds can be a very good reason – and indeed may be the best reason – for rejecting particular courses of action. This is why civilized societies reject torture as a way of exacting information. The report seems to be suggesting that moral grounds simply should not be allowed to get in the way of what the editors apparently regard as serious science – which, one might reasonably feel, is a somewhat chilling suggestion. But, the report tells us, “ … there is a need to respect the views of people who take a contrary view”. No doubt those who do object to the new technologies, on whatever grounds, can be catered for by a niche market.

But the need for “the genetic modification of living organisms and the use of cloned livestock and nanotechnology” is, apparently, accepted a priori, because “Investment in research on modern technologies is essential in light of the magnitude of the challenges for food security in the coming decades”. No discussion. Fact. In truth, as many people, including many excellent scientists and agriculturalists, have been pointing out for the past several decades, GM crops have so far solved no problems that really need solving – or where they have, they have in no case offered the best solution. (The one possible exception is virus-resistant papaya. But virus-resistant papaya, important though it is in context – it could be a prime source of vitamin A, making nonsense of Syngenta’s much-vaunted GM golden rice – does not justify the all-embracing hype around GM, any more than the non-stick frying pan justified NASA’s space programme, as NASA used to like to tell us). Meanwhile, GM crops are introduced at the expense of the traditional agriculture which produces most of the world’s food and which almost certainly could, with a helping hand, solve our problems. Again, evidence seems to have gone missing. As for nanotechnology – where did that come from? As for cloning – there seems to be some clash here with the report’s own concern for “The preservation of multiple varieties, land races, rare breeds and closely related wild relatives of domesticated species. This is very important in maintaining a genetic bank of variation that can be used in the selection of novel traits”. Cloning obviously implies the complete opposite. (Indeed, the idea of GM bananas has been promoted as a way of increasing their genetic diversity, precisely because cultivated bananas are sterile and so have to be cloned by cuttings. Hmm).

Where do we go from here?

One good thing has come out of this report. It confirms the need for the Campaign for Real Farming – for we really do need “A people’s takeover of the world’s food supply”. We cannot rely on the powers-that-be to deliver. It confirms the need, too, for the College for Enlightened Agriculture – for the world’s experts and intellectuals, or at least the present assemblage, are simply not addressing the right questions.



Why can’t people be more like horses?

We human beings generally arrange our affairs like ants: a boss at the top, and everyone else in line. To be sure, in Britain we nurture the illusion of democracy – no despots here, thank you very much! – for we have an elected government – our own representatives. But no-one who actually lives in Britain and isn’t simply a drum-banging chauvinist can seriously believe that this is how things work. A five-yearly choice between party machines tailored for power and funded by corporates and other self-seekers surely falls short of what democracy ought to mean.

Once in power, modern British governments do not actually govern in any worthwhile sense of the word, for they have lost control of the economy – the thing above all that governments really ought to be managing, on behalf of the citizens. But they do interfere with our lives at every turn, both on the smallest scale (what children should be taught in schools, for example) and on the grandest. For despite their economic haplessness, their political power seems absolute: to take us into wars that most of us feel very badly about; to put civilization on hold while they pay back the debts that they allowed the banks to run up; to sell off or to close down most of Britain’s industry; and – to return to the main theme of this website – to strike British agriculture a series of blows over the past four decades that have almost proved mortal, all in the cause of ideological and economic dogma.

So the government isn’t officially despotic, and we do have some freedoms left. I’m allowed to write this, for example, without serious fear of being beaten up, which wouldn’t be the case in Zimbabwe. But then, brutality in general reflects a lack of confidence. Why beat people up when you feel secure?

Many have argued, though – including the late and much-missed Brian Goodwin, lately of Schumacher College, Devon – that we could learn a great deal that’s of relevance to our lives – including the economy, and the manner of governance – from studying nature. Our own organization in general is ant-like – definitely a top-down hierarchy – yet we, at least ostensibly, are the least ant-like creatures of all. Ants need a top-down hierarchy because they are insects, with a simple (relatively speaking) nervous system. Their behaviour, including their social life, may be highly complex, and often it seems miraculously so – and yet it seems to be programmed, as a computer is programmed. Their behaviour may to some extent be flexible, but such flexibility is easy to simulate in a computer model – just a few either-or rules seem to explain it all (although we should surely stress the “seem”). But human flexibility is of a different order. In principle (we like to think) we can re-assess each new situation from many different angles, and consult in detail with each other, and then make genuine choices. We don’t simply choose, as a computer does and an ant seems to do, from a very short menu of pre-set possibilities.
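To illustrate what “a few either-or rules” means in practice, here is a toy sketch – my own, not any published ant model: a single foraging “ant” whose entire behavioural repertoire is a short cascade of if-then choices. It can look flexible from the outside, yet every “decision” is drawn from a fixed, pre-set menu.

```python
# A toy illustration of rule-based 'flexibility': the whole behavioural
# repertoire is four either-or rules and a fixed menu of actions.
# This is an illustrative sketch, not a model of real ant behaviour.

import random

def ant_step(sees_food, smells_pheromone, carrying_food):
    """Choose the next action from a short, pre-set menu of rules."""
    if carrying_food:
        return "head back to the nest"        # rule 1
    if sees_food:
        return "pick up the food"             # rule 2
    if smells_pheromone:
        return "follow the pheromone trail"   # rule 3
    return random.choice(["turn left", "turn right", "go straight"])  # rule 4: wander

# A few example 'decisions':
print(ant_step(sees_food=False, smells_pheromone=True, carrying_food=False))
print(ant_step(sees_food=True, smells_pheromone=False, carrying_food=False))
print(ant_step(sees_food=False, smells_pheromone=False, carrying_food=False))
```

Human choice, as argued above, is not like this: we can re-frame the menu itself.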

For we, of course, are not insects. We are mammals – with brains of a very different architecture which in general is far more “organic” (in the sense that architects use the term) and far more versatile.

So why, in our social organisation, in our approach to governance, don’t we behave more like our fellow mammals? For although some mammals seem to have an ant-like hierarchy (naked mole-rats are the most famous example), others (I reckon most) decidedly do not. We may think (because that’s the way we do think) that our fellow primates are organized as we often are – as simple patriarchies, with an alpha male in charge. But look more carefully and in most mammalian species, primate and otherwise, you find that the boss male rules only up to a point, and only by consensus. The patriarchies commonly turn out to be matriarchies and/or sororities and/or female gangs, with the males allowed in only for the specific purposes of reproduction (as in elephants); and if the males are suffered to strut their stuff as resident alpha males, it is only by consensus.

For example, no alpha male was ever more conspicuously alpha than a big male gorilla. The alpha male is the group protector and also the judge and jury, administering justice and dishing out punishment. Each is a little Solomon in his own kingdom. But if the boss male punishes the wrong subordinate for the wrong reasons, the older females give him a seriously bad time. They don’t attack him physically (he is twice as big as they are, after all) but they do send him to the rainforest equivalent of Coventry, until he mends his ways. Similarly, at the Woolly Monkey Sanctuary in Looe, Cornwall, Rachel Hevesi tells the tale of an aggressive alpha male monkey who was completely ignored after he had shown impatience with one of the infants – to be replaced by a smaller but more balanced individual. This is an anecdote, and some scientists shun anecdotes. But nature is very hard to observe, and without one-off anecdotes we would know very little about it at all. Rare events are often the most important – the pivotal points that change everything – but precisely because they are rare they are almost bound to remain anecdotal: they are observed too seldom to quantify, and cannot be fed into the statistical mincing machine.

Male lions, it seems, have a somewhat ambiguous status. The pride, as is usual, is a matriarchy and a sorority, ostensibly with a male (or pair of males, and occasionally three) “in charge”. The males – commonly pairs of brothers – take control in the first place by booting out the previous incumbents. Once they have muscled their way in they stay muscled in – paying their way presumably by providing some protection, notably against hyaenas – until they are seen off by the next wave of males, generally after a couple of years or less. Lions of course feature more prominently in human mythologies than woolly monkeys do, and human societies have often been more lion-like than primate-like. The lions in London’s Trafalgar Square, sculpted by Sir Edwin Landseer in 1867, look for all the world like Victorian statesmen, for whom they apparently served as a role model.

Now, I have just been told of a subtle variation in social organization not among high-flown primates but in horses, whose reputation for subtlety is somewhat mixed. The reference comes from Equest Partnership, based in Buckinghamshire (www.equestpartnership.com), which seeks to improve human behaviour – anything from team leadership to stress management (Journal of Stress Management) – by arranging contact with horses. Horses, I am reliably told (by Equest’s Harriet Worthington), have a wonderful intuitiveness. They don’t cerebrate like human moral philosophers or psycho-therapists but they do, I am told, empathise; and people who form relationships with them really do benefit. This ability to empathise is surely a function of their herd behaviour – they need to anticipate each other’s behaviour and they need above all to trust each other.

But although horses are very definitely herd animals, they do not have a fixed social hierarchy. They form ad hoc groups for different purposes, with different leaders as appropriate (if indeed a leader is needed at all). Hence they enjoy the many advantages of herd behaviour, but retain their flexibility.

In truth, the social organisation of horses, as described by Equest, seems essentially like that of the neural net: the way the nervous system works – not by obeying some in-built, despotic homunculus perched Mekon-like on top of the brain, but simply by re-grouping as the occasion demands. The human body is surely the most supremely organised structure in the known universe, and if the ever-reorganizing neural net works for our complicated selves, with our trillions of potentially independent cells, it surely could be made to work for our societies.

I see the Campaign for Real Farming, and other such endeavours, as an exercise in neural networking: not just an ad hoc organisation convened for an ad hoc purpose, but part of a new form of governance. The present government and New Labour would surely argue that they already have this in mind – that this is what is meant by devolution. Yet the very fact that the government is making the decision on how much power it cares to devolve, and how and to whom, and would surely pull the power back to the centre as and when it suits, shows how very far we are from the required end-point, and from the mind-set needed to bring it about.

But it seems that in principle, horses have already got there.

An apology and a cautious promise

Assiduous followers of and contributors to the Campaign for Real Farming website will have spotted a certain becalmment this past year. Alas, ’tis so. Yet we have not lost interest, and neither have we been asleep. We have been upgrading – from blog to interactive website, one day perhaps with bells and whistles: truly able to serve as a global clearing house of good ideas, and as a meeting place for the people who have those ideas and are doing good things or, indeed, for those who simply give a damn. Truly, when the Campaign website is properly up and running, it should emerge as the College for Enlightened Agriculture – a college in virtual form to be sure, but nonetheless able to punch above its weight; and with the potential to become much more. (There will be an article explaining all this under “Big Ideas” just as soon as I can get my act together.)

You can see that we’re not there yet. Not quite. In what you see now there are obvious glitches. Making a proper website, it turns out, takes an enormous amount of time and technical know-how, which I don’t have at all and which Ruth has only up to a point. But with the help of some truly able techies and some generous grants we are getting there – and we hope to be properly functional, if still embryonic, by the start of 2011. By the end of 2011, we should with luck have turned into perfectly respectable toddlers; and after that, who knows?

Colin Tudge

November 18 2010