Cultivation and Culture

Separation from Nature, and the technological program to control the world, did not originate with agriculture, despite the eloquent arguments of Daniel Quinn and others who associate the expulsion from the Garden of Eden into the world of toil with the transition from a hunter-gatherer to an agricultural mode of existence. Agriculture, rather, marked an epochal acceleration of a pre-established trend, an inevitable expression of a long-gathering latency.

With agriculture, the separate human realm expanded into radically new territory to include the various animals, plants, and other parts of nature that we made ours. No longer was domesticity limited to the campfire circle. With agriculture, we began to domesticate the whole world.

Because it was agriculture that launched the ascent of humanity into its present phase, the question of how and why agriculture began is critical. Many of the theories in the literature are unconvincing. The fallacy of the “nasty, brutish, and short” assumption of anxiety theory casts doubt on theories depending on population pressure and food shortage (food production can sustain far more people per square mile than food gathering). Hunter-gatherers had the means to regulate their population levels and in many places did so successfully for thousands of years; population grew dramatically as a result of agriculture more than as a cause. Another theory is that climate change or increased CO2 levels at the end of the Ice Age rendered old lifestyles untenable and made new plants available.[69] However, this ignores the rapid adaptability of hunter-gatherers, who inhabited a wide variety of ecosystems even before the end of the Ice Age, and in any event, the transition from one environment to another would seem much less difficult than the transition from foraging to farming. Most ridiculous are explanations that imply we only figured out the idea of planting seeds quite recently: foraging cultures have a highly sophisticated understanding of plant reproduction and the conditions for plant growth.

The fact that agriculture arose independently in several locations around the globe points to a natural progression from earlier technology and mindsets, and not an accident (whether mistake or glorious invention) that could just as well not have happened. Agriculture arose independently in Mesopotamia, in China (possibly in two locations), in South America, in Central America, in the Eastern United States, and perhaps in New Guinea and sub-Saharan Africa.[70] Indeed, most of the places where agriculture failed to develop were where there was a dearth of easily domesticable plants and animals. With few exceptions, wherever we could develop agriculture, we eventually did. Somehow, an agricultural future was built into who we were in the late Upper Paleolithic.

Inevitable or not, agriculture was not a sudden invention but the cumulative consequence of a series of incremental developments that marked a gradual shift in human attitudes toward nature. Although in our customary dualistic mindset we may be tempted to see agriculture as an invention, a distinct epochal transition, Jared Diamond plausibly describes the origin of agriculture as a gradual step-by-step transition from the hunter-gatherer lifestyle. At first, perhaps, nomadic hunter-gatherers merely followed the herds of the ancestors of modern cattle, sheep, and so on. Over many generations, these nascent herders began to provide food and protection at key moments, upon which the animals came increasingly to depend. Crop planting may have started as a wider scattering of seed or removal of competing plants to give favorite foods a head start, followed perhaps by months of nomadic foraging for wild foods. Eventually, these plants came also to depend on the assistance of the planters, whether through deliberate breeding or unconscious coevolution. In any event, the domestic corn plant cannot reproduce without human assistance; nor does a domestic chicken stand much chance of survival in the wild.

Once domestication began, the much larger population density it permitted meant there was no going back. Agriculture, the archetype of human control over nature, induces dependency and the need for ever-increasing control—over land, people, plants and animals—as the population continues to grow.

Along with the gradual shift to agriculture came a transformation in human attitudes toward nature. Hunting accords with a view of other animals as equals. After all, nature works that way—some eat and some are eaten—and the human hunter is doing nothing different from animal hunters. Domestication imposes a hierarchy onto the interspecies relationship, as man becomes lord and master of the animals. Understandably, this relationship is then projected onto the whole of nature, which becomes in its entirety the object of domestication and control. Yet we must also consider that the innovation of animal domestication could perhaps not have happened in the first place unless nature were first objectified conceptually. The solution to this chicken-and-egg problem lies in the embryonic self-other separation embodied in all life forms, going back to prehuman times. Domestication merely represents its crystallization into a new phase: a slow-motion gestalt which also included all the other elements of separation detailed in the foregoing sections.

The farmer’s new relationship with nature engendered a new conception of the divine. As agriculture and other technology removed humans from nature, so also did the gods become supernatural rather than natural beings. The process was a gradual one, starting with ancient pantheons closely identified with natural forces. Gradually, identity evolved into rulership as the gods were abstracted out of nature, eventually resulting in the Newtonian watchmaker God completely separate from the earthly (the natural) realm. At the same time, as we lost touch with nature’s harmonies and cycles, the gods took on the capricious character exemplified by the Greek pantheon and the Old Testament. Accordingly, the gods must be propitiated, kept happy through the offering of sacrifices, a practice found in most ancient farming and herding cultures but not among hunters.

The angry God that arose in early civilizations is also linked to the concept of good and evil and the concept of sin. The corn is good, the weeds are bad. The bees are good, the locusts bad. The sheep are good, the wolves bad. Technology overcomes nature by promoting the good and controlling the bad. As for nature, so also for human nature. The self is divided into two parts, a good part and a bad part, the latter of which we overcome with the controlling technologies of culture.

Whereas hunter-gatherers could easily adapt to all the vicissitudes of the local climate, farmers were at the mercy of drought, hail, locusts, and other threats to a successful harvest. While the resources of hunter-gatherers were virtually unlimited and their population fairly stable, agricultural civilizations experienced famines, epidemics, and wars that decimated whole populations and defied any attempt at prevention. Here was a source of constant, inescapable anxiety woven into the fabric of life itself—no matter how successful this year’s harvest, what of next year?—as well as a motivation for the increased understanding and control represented, respectively, in science and technology. Scarcity and the threat of scarcity are implicit in the attempted mastery of nature. Jockeying for position in the face of scarcity, we endure an endlessly intensifying competitiveness that is built into our system of money, our understanding of biology, and our assumptions about human nature.

Paradoxically, while agriculture raised nature’s productivity of food (for humans), it also introduced the contemporary concept of labor. Food was at once more abundant but also harder to get. With agriculture we had to work today to obtain food tomorrow—a primary example of the paradox of technology, which has brought us to the brink of catastrophe despite its motivating goals of ease, comfort, and security.

Agriculture, because it involves keeping nature from its rest state, necessarily involves effort. I don’t need to do any work to grow thistles, burdock, and crabgrass in my garden, because that is what those thousand square feet of land naturally tend to. But to grow cabbages, kale, and garlic I have to do all kinds of work—pulling up the plants that crowd them out, erecting fences to keep out the rabbits and woodchucks, etc. The truism that we reap only what we sow goes back only as far as agriculture. Before then, we could reap without sowing: nature was fundamentally provident. For the hunter-gatherer, the providence of nature requires little labor or planning, but only an understanding of nature’s patterns. Primitive survival is a matter of intimacy and not control.

The advent of agriculture accelerated the demise of the gift mentality that characterizes hunter-gatherer societies. Whereas hunter-gatherers see game and food plants as gifts of the earth, a farmer tends to see them as items of exchange for labor, and his goal is always to tilt that exchange to his benefit. No longer is sustenance something that the world freely provides. Whereas the hunter-gatherer is part of the gift network that we call an ecology, the farmer separates himself from that network and seeks to extract what he needs from it. Thus Daniel Quinn names the hunter-gatherers “leavers” and the agricultural societies “takers”,[71] though perhaps “givers” would be a better word for the former. Eventually, the new relationship of taking and exchanging manifested among humans as well, setting the stage for the rise of money and property.

When we need to apply effort to coax a livelihood from the land, humanity’s relationship with nature tends to become adversarial. The land naturally drifts towards weeds, pests, and in general a less productive default state. With the technologies of agriculture we seek to prevent this from happening. The battle lines are drawn. Today we seek to live more sustainably, yet the oppositional view of nature that environmentalists lament as the most destructive force on our planet is built into the very origin of civilization—agriculture. What else can we expect from a technology founded on arresting or reversing the processes of nature? Consequently, the end of humanity’s war against nature must involve a wholly different approach to technology, and not merely better planning, fewer accidents, more foresight, and tighter control.

Agriculture inaugurated our conception of the earth as a resource or asset, defined primarily by its productivity. The land gradually lost its intrinsic value—its sacredness—and assumed an extrinsic, conditional value based on what it could produce. For the first time there was good land and bad land. The transition was a slow one, advancing each time new technology separated us a step further from original natural cycles. The primitive farmer is still very close to the land, even if less completely embosomed by it than the hunter-gatherer. Each new technological advance freed us from one or another natural limitation, culminating in modern industrial monoculture in which, given the right inputs, almost anything can be made to grow on any land. Even a desert can be made to bloom.

It is becoming increasingly apparent, though, that natural cycles can be ignored only temporarily. Their disruption bears consequences that can be postponed by a series of technical fixes, but never permanently denied. Deserts can be made to bloom, yes, but only at an increasing cost and not forever. Someday they will return to their natural state. Moreover, the consequences of the disruption of natural cycles intensify the longer it is sustained. The present accelerating desertification of the world’s agricultural areas testifies to the impossibility of forestalling natural processes forever, and to the severity of the consequences of trying. It is as if the desert cannot be denied.

The valuing of land according to its productive function and not its innate sacredness projects onto human society in the division of labor. With agriculture, human beings began to be distinguished by function—farmer, soldier, metalsmith, builder, priest, king—in a way they never had before. True, there are chiefs and shamans in pre-agricultural societies, but with rare exceptions they are not exempt from hunting and gathering food. They do not specialize by trading food for their services. In agricultural societies people came to be defined more and more according to generic functional classifications, a trend that drew as well from the anonymizing effect of the vast increases in population density that agriculture permitted.

With agriculture, a new category of being came into existence: the stranger. Before then, humans lived in tribes of at most 500 people, comprising bands of about 15-20 people each. It is not difficult to know 500 people by name and face, especially after a lifetime of frequent association, but beyond that the identifying structures of kith and kin become tenuous and some people necessarily fall into the category of “other”.

In a hunter-gatherer band or tribe, and even in a Neolithic village, we were intimately known by virtually everyone we ever interacted with. Our acquaintances collectively embodied a tightly integrated web of relationships from which we derived our identity, our sense of self. We answered the question, “Who am I?” through relationships with people who knew us very well, as unique individuals. But as the scale of society expanded, these personal relationships gave way to generic ones governed by commerce, law, and religion. Accordingly, the sense of self came to depend on these structures as well, which are by their nature anonymous and impersonal.

Relationships in primitive societies are guided by kin structures that provide each person a place relative to each other person. When society expands in scale to the point where two people are strangers, unable to place each other in their respective constellations of self, then there is a serious potential for conflict. Some kind of impersonal governance is required in the absence of structures of known relationship. After all, when someone is not “self” then he is a potential competitor whose interests might be at odds with ours. Practically speaking, if someone is a stranger there is no rational reason not to cheat him. Since he is not linked to your own social network, the consequences need never come back to haunt you. Hence the need for some kind of regulatory structure imposed from above.

When hunter-gatherers from different bands run into each other in the bush, they immediately begin an urgent and often very long conversation about who they know in each other’s band, seeking to identify their relationship. Eventually, they establish that one is the cousin of the sister-in-law of the nephew of the other one’s brother-in-law, effectively bringing each other into the same constellation of self. Vestiges of this behavior are apparent today when two strangers talk: “You were in Taiwan? Hey, I had a classmate from Taiwan—do you know so-and-so?”

When encounters between strangers are common, then some kind of governance is necessary based not on their unique relationship as individuals, but on generic principles: “All are equal under the law.” Laws in the form of explicit codes are never found among pre-civilized peoples, nor are they necessary. It is no accident that as modern society grows increasingly anonymous, and as we pay strangers to perform more and more life functions, the reach of the law extends further and further into every corner of life. Disputes that were settled informally a generation ago are today routinely administered according to written rules. Indeed, without some kind of formal standard we would feel insecure, for we would literally be at the mercy of strangers. This trend is a necessary consequence of the alienation and depersonalization that began with agriculture.

The division of labor introduced a new kind of anxiety into human life rooted in the idea that “You have to work in order to survive,” a concept apparent today in the locution “to make a living.” Foraging peoples engaged in various arts and crafts beyond what was necessary for survival—the fashioning of musical instruments, for example—and there was surely some differentiation of skills and talents among them. However, food was always there for the taking, readily available regardless of prior planning or its lack. We typically applaud the agricultural surpluses that “freed the non-farmer to specialize in other skills,” not realizing that the non-farmer, lacking the means to obtain food on his own, was thereby enslaved to his specialization. Freedom is slavery. Art became profession and its product became commodity. Work is nothing other than art diminished, degraded, and debased. Driven by economic necessity (a code phrase for survival), no longer could we work in “fits and starts, and in these occasional efforts . . . develop considerable energy for a certain time.”[72] Even worse, art created in the interests of economics is no longer art, for good enough is good enough. Why make it any better when it is to be exchanged anonymously for food or money, when it is to be given over unto the Other?

In the early days such exchanges were not completely anonymous. Money only partially replaced other forms of reciprocity, and most human interaction was not with strangers. Moreover, the life of the subsistence farmer is still intimately involved in the cycles of nature, wedded to the soil, and sustained only through a knowledge of and respect for natural laws rivaling a hunter-gatherer’s. Indeed, for us moderns, gardening brings us closer to nature, not further from it. However, once it started, the ascent of agriculture built upon itself: technology advanced, the population grew, the regime of control intensified, and the conceptual dichotomy of human and nature widened. Finally, the institution of the Machine that culminated in the Industrial Revolution completed the degradation of work from its original identity with art to its present condition of slavery.

 

[69] See for example Sage, R.F., 1995. “Was Low Atmospheric CO2 During the Pleistocene a Limiting Factor for the Origin of Agriculture?” Global Change Biology 1, 93-106.

[70] Diamond, Jared. Guns, Germs, and Steel, p. 99. The inevitability of agriculture (that it developed in every place where there were domesticable species) is a main theme in Diamond’s book.

[71] Quinn, Daniel. Ishmael. Bantam Books, 1995.

[72] Gusinde, Martin. The Yamana. Human Relations Area Files, 1961, p. 27; cited in Sahlins, p. 28.