By Adam Garfinkle
For nations to be successful in the 21st century, three assets are key: human capital; social trust; and institutional coherence.
Human capital translates broadly into education, but education of a certain kind—education designed to nurture critical thinking and creativity.
Social trust is about engendering mutual expectations of reciprocity in day-to-day interactions, producing the kind of underlying social stability that protects individual initiative, enables rule of law, and underpins the sanctity of contracts.
Institutional coherence refers not just to rule by accepted code instead of rule by personal whim within given institutions (schools, the army, universities, and so on), but also to the way institutions dovetail, cooperate, and support one another in a society. (To take a famous example, the Manhattan Project worked as well as it did because government, academia, and business in the United States were able to coordinate their efforts in a manner that enabled the outcome to exceed the sum of its parts.)
How does a traditional or transitional nation achieve or maximize these three key assets in a 21st century setting? This essay will answer this question, but it is first necessary to lay down some preliminary definitions and observations about the subject matter in general.
Preliminary Issues
For lack of a better way to say it, development projects in the world today, as they have been since the post-colonial period began in earnest about a decade after World War II, amount to efforts to become “modern.” To a great extent, this vocabulary derives from the fact that the scholarly and policy-oriented literature on this subject has been overwhelmingly produced in the Western countries, and that the first generation of independence leaders was educated either in the West—Britain, France, the United States—or in the Soviet Union, whose depiction of modernity was a pointedly ideologized and distorted, yet still mainly Western, creature.
That said, the word “modern” is typically used in a lazy fashion to mean simply that which is contemporary, the assumption being that, at least when it comes to generating economic growth and the national power it sustains, what is contemporary is richer and more powerful than what came a century or two before. This usage is understandable, but it is problematic for several reasons. One of those reasons is particularly acute.
Since the beginning of the postwar Western cottage industry of modernization studies, a debate about the sequencing and relationships of political and economic modernization has developed. It has displayed the character of a standard chicken-and-egg conundrum: Does economic development need to precede political modernization, or does political modernization need to precede sustainable economic development? Or can they be stimulated together?
Samuel Huntington settled the matter intellectually in favor of the primacy of political institutions.[1] Except that he didn’t settle it in terms of praxis, because even in the 1960s everyone knew that political modernization was hard and slow while economic development—as amazing as it sounds to say it today—was thought to be much simpler to achieve, and to achieve quickly. The development experts at the U.S. Agency for International Development, the World Bank, and prestigious non-governmental institutions (for prime examples, the Ford and Rockefeller Foundations) conceived the route to economic development as a technical exercise in which the more uniform the inputs and the higher their quality, the greater would be the desired and predictable outputs. An appropriate metaphor is that of a vending machine: Different machines might offer different products, but they were all still vending machines, and one could build and repair all of them basically the same way.
The literature of that day is instructive. An ur-source for this way of thinking is the work of a Kennedy Administration “best and brightest”: Walt Whitman Rostow. His work, The Stages of Economic Growth (1960), posited a method to achieve “take-off” that applied to all countries no matter their divergent histories, cultures, or factor-endowment blessings. It was based on abstract economic theory and helped form the thinking behind the 1961 birth of the U.S. Agency for International Development—USAID—when Rostow joined the Kennedy Administration as Deputy National Security Advisor.
Any economy could grow, no matter its specific political arrangements, argued Rostow, so long as those arrangements allowed for certain policy techniques to be applied: import substitution; technical education; and especially technology transfer. The idea was that if you took a machine that worked wonders for productivity in a Western economy and shipped it to, say, India or Sierra Leone, it would—provided there were people who knew how to operate and maintain the machine and a market for what it produced—be as productive there as it was where it was invented. The main presumption here was the universal applicability of technique based on abstract theory, and the concomitant irrelevance of what some academics called culture. An added assumption, true to the developing behemoth of macroeconomic orthodoxy, was that all human beings are interchangeable and can be presumed to be rational actors set on maximizing value.
Of course, this theory did not work very well, as a few observers such as Peter Bauer and later Albert O. Hirschman warned it would not. Before long, some economists began to understand some of the reasons the theory did not work as predicted. Technology embodied the capital/labor ratio of the society that produced it, so if you simply shipped a machine to a place with a different capital/labor ratio, and other different factor endowments, it would not produce comparable net economic results. Technology transfer to India, mainly for agricultural application, proved a case in point: Machinery developed in a place with an abundance of capital relative to labor, sent to a place where capital was scarce but labor plentiful, produced crops only at the cost of driving huge numbers of people off the land before a nascent industrial economy could assimilate them. The upshot was sprawling slums like those of Calcutta.
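To make the factor-price logic concrete, consider a stylized bit of arithmetic of our own devising, not drawn from the literature just cited: suppose a machine requires $k$ units of capital at rental rate $r$ and displaces $n$ workers who each earn wage $w$. Private adoption pays when

\[
\underbrace{r\,k}_{\text{capital cost of the machine}} \;<\; \underbrace{n\,w}_{\text{wages of the workers displaced}}
\]

Where capital is abundant and labor dear (low $r$, high $w$), the inequality holds easily and net output rises; where capital is scarce and labor cheap (high $r$, low $w$), it holds barely or not at all. And even where it does hold for the private adopter, the $n$ displaced workers represent a social cost that a nascent industrial sector cannot yet absorb.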
Culture mattered too, it turned out. So, for example, in one West African country a USAID project designed to bring water up the hill from a river to a village using electrical pumps failed not so much because reliable supplies of electricity could not be maintained, but because the pumps displaced the lines of women who traditionally passed the water, bucket by bucket, up the hill each morning. That line of women served an important social communications function for the village and without it things went a little haywire. So the villagers together took the pumps out of service, turned them upside down and planted flowers in them. When USAID personnel returned to review the project’s progress, they were predictably nonplussed. These were not “ugly” Americans, merely clueless ones of the generic variety who had yet to understand that one size does not fit all when it comes to the kaleidoscopic intersection of technology, social structure, and culture.
They simply did not understand that what used to properly be called the discipline of political economy did not obey universal rules. They did not recognize that the definition of “modern” traces particularist Western historical patterns, just as technology bears the imprint of the society that produced it, and hence takes its meaning from those patterns.
The Western way to modernity over many centuries involved three key developments: the rise and ascendancy of individual over communal agency; the development of secular norms not only in politics but also in the relationship of the arts to religious authority; and the rise of a this-worldly linear teleology of progress in place of the cyclical chain-of-being conception of medieval times. This formula may or may not be applicable to non-Western nations, and even if applicable it may not be possible to achieve modernity in the same way that Western societies achieved it in the past. In other words, non-Western societies can become more affluent and their governments more powerful in interstate relations, but in ways that do not trace the developmental patterns of Western countries and hence are not, strictly speaking, “modern.”
We know it is possible because some non-Western societies have become as prosperous and powerful as Western states. The first to show this was Meiji-era Japan. Nations on the cultural fringe of Europe have done so, too—Finland is a good example. There are all sorts of other cases that resemble hybrid or partial examples. India is becoming modern, but that owes a great deal to the unusually long tenure of the British Raj, without which it is unlikely that India today would be a unitary state at all. Singapore and Israel also represent hybrids of a sort, but both are of city-state scale, and that matters. Israel is both European and non-European in character, its core culture antedating the presence of Jews in Europe, its motive and “modern” raison d’être being not to spread but to escape the European predicament, and its demography after 1949 growing increasingly non-European as well. Singapore is a singular case owing to, again, its long British tutelage, its ethnic heterogeneity, and its enlightened but autocratic Confucian style of leadership.
Of course, still other countries have managed to become rich without being wealthy in a true sense, or socially and politically “modern” in the specific sense in which we have here defined it. These are mainly countries afflicted by the “oil curse” or some other resource curse to one degree or another: some Arab countries but also others as different as Ecuador, Mexico, and Venezuela in Latin America, Indonesia and Brunei in Asia.
The point is that there is no one-size-fits-all answer to achieving affluence and power because different nations have different histories and cultures as well as different factor-endowments. As a result, contrary to the early modernization literature in the West, there is no single path to political and economic “modernization,” and hence important questions of priority, sequencing, and speed in development initiatives cannot readily be generalized: For any practical purpose, each case will have its own inner logic.
Yet despite this experience over the past seventy years, and the clarity of its key lessons, it is still often the case that “modernity,” lazily used and not precisely defined, is conceived overwhelmingly in terms of economics, not political culture. Just as Western elites and publics all too often bow to the false god of Gross National Product, so do many aspiring modernizers outside the West. This is a category error of major scale. It is lately tied to an obsession with the false precision and utility of quantitative facts, and it recalls Nietzsche’s warning: “Were it not for the constant counterfeiting of the world by means of numbers, men could not live.”
A story may help illustrate the point. There was nothing especially current or American about the often-unstated assumption in the postwar modernization literature that there was but one path to modernization out of traditional society. Proof reposes in Würzburg, Franconia, in the 1744 Residenz of the Schönborn prince-bishops.
In the Residenz there is a magnificent ceiling fresco, completed in 1752 by Giovanni Battista Tiepolo, called “Allegory of the Planets and Continents.” In the painting all the cultures of the earth are looking toward—and seem to be making their way to—the epicenter of culture and refinement, which just happens to be in Würzburg. The cultures are ranked from primitive to refined by their distance from the epicenter, such that Tiepolo’s rendition of Native Americans, for example, is quite far down the wall. But the assumption painted into the whole is that they, like every other culture, are on the road to Franconia, and one day, if they keep the ideal of the Schönborn prince-bishops in mind, they too will get there—allegorically speaking, of course. Everyone would be Catholic, yes; but in the material splendor of the Residenz one could be forgiven for presuming that an afterthought. These were worldly men and women.
The point is not just about Western conceit or materialism, but about the presumably singular road to this-worldly refinement. That same presumption became formalized in the 19th century under the Three Age System theory, whence it passed into the 20th-century modernization literature—a good example, focused on the Middle East, being Daniel Lerner’s influential 1958 book The Passing of Traditional Society.
It was assumed, too, that as modernization proceeded, superstition—also known to the divines of Western secular modernity as traditional forms of religion—would melt away before the power of scientific rationality, taking with it its attendant depredations and excuses for inequality and exploitation. Religious institutions would become privatized as they were in the West, and benign secularity would reign supreme. Insofar as it persisted, religion would become more “sophisticated,” meaning that it would be turned essentially into a form of communal therapy in which the old theology became a respected if sometimes embarrassing vestige of times past.
The main problem with modernization theory (and its corollary about religion) was that reality did not keep pace. As already suggested, it turned out that modernization was neither ineluctable nor, where it seemed to be occurring, uniform. It turned out that achieving the Weberian characteristics of a modern state—a prerequisite for Western-style democracy and complex market capitalism—came easier to some cultures than to others. Some countries figured it out on their own terms, but much of the world remained within traditional patrimonial social structures. These structures were, and still are, not just different in degree on some stretched out unitary timeline; they are different in kind. Trying to export modern forms of democracy and complex market capitalism[2] to tribal-patrimonial societies is a little like trying to attach a standard bicycle rack to a horse. You can perhaps tie it up there with effort, but it won’t work very well, it won’t stay there for long, and there’s not much sense in wanting to do it in the first place.
In due course, modernization theory’s analysis of religion also fell to pieces. Scholars showed how in many Muslim societies upward mobility (associated with greater literacy, urbanization, education, and labor specialization) correlated with greater piety and stricter levels of observance, not the reverse. This form of “modernist” piety became known in the Western scholarly literature as neo-fundamentalism, and its study showed that the sociological location of religious institutions within cultures varies from place to place and, in any given place, can vary over time.
It would therefore be tragicomically strange if non-Western “modernizers” today adopted the “one-path-to-modernization” thesis that has failed so badly over the past century or so. We need to be very clear here, so we must repeat: There is no universal abstract theory of economics that works to create material affluence in all societies.
Indeed, the key variable in all cases is not economic at all in the narrow sense of that term, but rather the “fit” between cultural norms and the internal and external resources available to a nation’s elite. English words like “progress,” “corruption,” “equality,” “accountability,” “fairness,” and many others possess meanings bound to culture, just as “modern” is, as already explained, bound to the cultural zone that created that vocabulary. All translations of these and other terms into non-Western languages are at best approximations shot through the prisms of particular cultures. That is why all uses of the terms “modern” or “modernity” below are put in scare quotes.
Of course, cultures do change. Usually they change slowly, but sometimes they can change more quickly if certain circumstances are present. A population with a very young median age exposed to pluralist experiences through new technologies is one such circumstance. Enlightened leadership to direct the opportunity for change is another. Nevertheless, cultures are webs of dialectics between attitudes and institutions, so to change any one major aspect of how a culture works is usually to change many others without knowing exactly how the parts fit together, and so to begin a process whose outcomes are very difficult to predict.
For that reason, the definition of what a “modernizing” leadership wishes to achieve is critical. Very few non-Western elites bent on “modernization” want to ape Western characteristics wholesale. They usually want to pick and choose, and, to be sure, one can distinguish modernization from Westernization. For example, “modernization” out of a traditional society nearly always demands a new conception of time, usually a sensory appreciation of smaller units of time and what can be done with them in economic processes that demand higher degrees of coordination among individuals. But that doesn’t mean that non-Western societies have to adopt the Western calendar’s month names or count years according to the Christian era. Similarly, traditional dress may not be appropriate for some kinds of industrial work environments, but that doesn’t mean that factory workers must dress like Western laborers to achieve the safety and efficiency standards necessary to the task.
Still, to drive home the point, it is very hard to pick and choose: Importing techniques from outside one’s culture without importing certain attitudes with them has proven very problematic; and inserting those techniques and attitudes into one’s own culture cannot reliably affect some functions without also affecting many others that are difficult, or impossible, to anticipate. Again, cultures are interwoven webs—distributed systems that marry conceptual and utilitarian aspects of social life. They cannot be precisely sliced and diced, and no applicability formulae worth a damn can be gotten from books on the shelf. It just doesn’t work that way.
All that said, five general observations apply to the “modernization” projects of the 20th and now the 21st century.
First, fully “modern” states and economies are unlikely to arise from patrimonial social structures; some kind of Weberian institutional development establishing formal rule of law must occur. In the Western experience secular rule of law arose over many years out of canon law. In many non-Western cultures, the legal system is a mix of indigenous and imported legal norms. But the point is that there is a critical difference between rule of law and rule by law, which is a halfway house between arbitrary personalistic rule and more systematic codes. Western democracies have rule of law; China has rule by law; many patrimonial societies have arbitrary personalistic rule whatever the formal arrangements say is the case.
Second, economic modernization is unlikely to “stick” in the absence of political modernization; this is why, ultimately, the so-called Beijing alternative to the Western pattern—a modernism based on statist top-down industrial policy and the instrumentalization of individual members of society—is unlikely to be sustainable in the end if it does not bend toward political pluralism. Nations can become rich by aping Western techniques and stealing their creative innovations, but that alone cannot build a real social-cultural infrastructure for world-class wealth and power. It can seem to do so for a time, when a country, like China, is rising from a very low base—and when very deep reservoirs of social trust, born of a society’s cultural homogeneity, can compensate for the lack of pluralist politics. But sooner or later the truth will come out, and the limits will become visible.
Third, just as science is not an accumulation of facts but a process, so “modernization” cannot be measured by GNP data, but is rather reflected in the ongoing convergence of attitudes and institutions. To get “modern” results in terms of relative affluence and state power requires ways of thinking and organizing that differ from traditional patrimonial ways of thinking and organizing. “Modernization” is not a thing; it is an ongoing human process. It is not a noun really; it is a verb.
Fourth, pace early Western modernization theory, it is not true, to repeat in different terms, that strong faith communities sustaining religious beliefs are incompatible with “modernity.” On the contrary, certain forms of Protestant Christianity correlate very positively with economic vitality and growth, not only in historical Europe but also, for example, among new Protestant groupings in Latin America. But religious views that insist on a perfect identity of theological and political values are problematic, especially in heterogeneous societies, because they tend to smother critical thinking and undermine social trust.
Fifth, political modernization does not necessarily imply or require electoral democracy; accountability is critical, but there are more ways than one to achieve it. And again, cultural homogeneity can to some degree supply levels of trust that can compensate for the relative absence of political “voice.”
Finally, two broader preliminary points. During the entire period of “modernity” thus far, the state has been the key institution in generating and embodying “modern” ways. Indeed, in many respects the development of a strong executive function was a prerequisite for “modernization” along with the other two key components: accountability and rule of law. But the centrality of the state has changed over time, becoming stronger in the late-19th and early 20th centuries, but waning in more recent decades. The relationship between state and society is still changing, and the relationship between them that conduced to “modernization” a century ago may no longer readily apply. Rigidly centralized state structures may not be the optimal formula for successful “modernization” in the years ahead; netcentric, distributed federalized arrangements may become optimal instead.
Perhaps most important, developing nations may soon face the same problem as developed ones: the end of work. Automation, still at a relatively early stage, has the potential to make the majority of the human work force in a society unnecessary to the production of basic value, to and even beyond the satisfaction of basic needs. But to work is to have dignity: There is no model of a successful human civilization in which people are superfluous to the well-being of the commons. It is a mistake to conceive of what we usually call “unemployment” in strictly materialist terms, as the underutilization of a factor of production. No quantitative measure can capture the meaning of work in civilizational terms. This is a challenge—whether very great or less so, we are still trying to understand and anticipate—that developing and developed nations are fated to meet together in the decades ahead.
Human Capital
The term “human capital” is in some ways an unpleasant one, because it implies the instrumentalization of human beings for some purpose often construed to be beyond them. It is perhaps not as repugnant as speaking of the “opportunity costs” of labor or of casually being “invested in” a person, as is common in English. But unpleasant or not, it is the term used to describe the power of education for purposes of political and economic development, and no obviously better substitute exists—so we will use it.
As we have substituted mechanical power for human labor over the past several centuries (indeed, for longer than that if one considers the ard plough and the millstone powered by domesticated animals), so in more recent times we have been augmenting mechanical power with conceptual innovation. This is not the place to detail all that is involved in these tectonic social and economic shifts; suffice it to say that what people generically call the “knowledge economy” is real, and its significance is growing. This is why societies playing catch-up to the global leaders using an industrial template find themselves falling further behind in relative terms even as their absolute achievements demonstrate success.
Education is a very broad and elastic term. It raises the questions of what kind of education, for what purposes, for what array of distributed tasks in a society. It also raises the question of where and how education in various forms should be acquired in a “modernization” project. Is it a good idea, for example, to send significant numbers of promising students abroad for higher education, or is it better to develop educational institutions quickly on home soil? How important, really, is elementary education if the number of key innovators and scientists a nation needs to achieve top rank is quite modest? These are good questions, and not even in the United States and other advanced countries are there stable consensus answers to them.
Here we emphasize only two points. The first, which we can dispense with very quickly, is that knowledge-based economies require educated people at all levels, so elementary education is as important as secondary and higher education. People need to be educated as consumers as well as producers, contributing to the social infrastructure as parents and teachers as well as making products and offering services.
The second point is just as important, but it is far more challenging to explain and illustrate. It is well known the world over that forms of traditional education stressing memorization and unquestioned obedience to the authority of teachers are not the same as education that stresses giving students the ability to think critically and creatively. The aforementioned ascendancy in the West of individual over communal agency is deeply relevant here. So is the pluralization of religious belief at a certain point in the history of Western Christianity—mainly the 16th and 17th centuries—which inadvertently made a virtue of doubt and, from what was basically an historical accident, generated a concept of genuine tolerance for intellectual and theological diversity out of what had before been merely forbearance. The two are not the same, and there are cultures in the world that still lack that development.
Also critical to the development of Western “modernity” was the entwined impact of the Enlightenment in the West, which grew conterminously with the Protestant Reformation. The “age of reason,” a term of art for the introduction and spread of scientific rationality in Western societies, in time created the specifically “modern” concepts of both “causality” and “factuality.” This is not to say that similar concepts did not exist in other cultures that antedated the Enlightenment and the Reformation; to think so is a Western conceit born of ignorance, or a lack of interest in others, or both. Nonetheless, the form that the mingling of early Protestantism and the Enlightenment took led to very particular concepts of causality and factuality that became bound up with what it meant to become “modern,” and that are thus associated with the wealth and power that Western modernity has manifested now for quite a while.
The scientific-rationalist idea of causality is instrumental, not fundamental. It is not interested in first-mover causes, namely the character of creation; that matter Westerners left to theologians in a social world increasingly characterized by intellectual modularity. Other models of causality are interested in what is fundamental, meaning that it is not easy to divorce philosophy from pragmatics. Philosophies linked to theologies which insist that the prime or first cause continuously recreates the universe each second are not compatible with instrumental rationality. When the two became separate in the West, it became possible to credit the virtue of experimentation both for the sake of sating curiosity and for the practical application of knowledge. Absent any church-based tests of such experimentation, freedom of inquiry led to massive examples of practical innovation.
Experimentation, and the human spirit that animates it, is critical for development. Dissatisfaction with the status quo is not as important as the excitement that comes from discovery, and discovery requires an experimental attitude. A theology-directed way of thinking holds that certain abstract verities are true and will always be true, so that one can go from principle to application directly. This is analogous to the Western modernization assumption that pure theory could be translated down into universal development praxis. It is mistaken. As already noted, because of cultural and factor-endowment diversity, there is no one road to success. So it is necessary to try out many ways to solve an identified core problem, to see what works, and to move patiently through prototyping before scaling up.
The Enlightenment also led to the idea that there were discrete data in the world—facts—that existed separate from what anyone felt about them. Again, when no theological test was applied to whether the earth rotated around the sun or the other way around, it was possible, eventually, to make rapid progress applying new scientific instrumentation (the optics of telescopes, for example) to discoveries about the natural world. There are still places in the world where the idea of an objective fact—and that can include the social facts of demography and other disciplines—that is separate from how anyone feels about it has yet to set firm roots.
These specific concepts of causality and factuality led to the pluralization of the idea of truth. Yes, people in the early modern West believed that the world made sense as a whole. They understood and believed in a first cause. But truth was nevertheless divisible for discrete purposes, and experts could exist in more fields than one, such that philosopher/theologians could not command or interfere with what other experts thought or did. This pluralization of intellectual pursuits was critical to the successful material development of the Western nations, there can be no doubt about it.
Yet even more basic to the processes we are summarizing is an often-overlooked matter: literacy, and in particular what we will call deep or full literacy.
The rise of individual agency appears to depend on the advent of interiority in a person—the sense that a person has an inner essence that defines his or her individuality. This is sometimes referred to as “the narrator” within, the “voice” in our heads that we silently “hear” and that we as individuals know as the inner “me.”
This seems so common a cognitive facility and experience, at least in Western societies, that we can draw attention to it and we can and do discuss it in both common and intellectual language. But in earlier human history, and in some places still today, the boundary between the interiority of individual consciousness and the social connectedness of the person to the larger community is not the same. All people are social animals and all have some sense of their individuality, yes; but the boundary between the two is not universally set or static. What creates this keen sense of interiority, which is the preface to the rise of an autonomous and creative sense of individual agency, is deep literacy.
Literacy is not just one thing; it is layered, so to speak. To be able to decode written symbols, as on a list for example, is one thing. That is not deep literacy. To be able to write one’s name or make lists is not deep literacy either. Deep literacy is what happens when a person engages with a book or an extended piece of writing in such a way as to be able to anticipate an author’s direction and meaning, and to bring to bear what one already knows to the reading of a text.
Deep literacy has been aptly described as a revolution in the brain. Acquiring the ability to read deeply actually shapes the brain’s neural pathways, the more fundamentally so the younger the new reader is. It enables a facility with abstract logic. It nurtures the imagination. It allows people to experience people, places, history, and ideas that are simply unavailable through direct experience. It changes a person’s life in a profound way.
Consider the development of the relationship of narratives to thinking in a child—at least in mass-deep-literate societies. The process starts when a child sees a book without being able to decode or read it, but the book is read to him or her by a parent, other family member, or a teacher, creating a link between the written word, the phonemes the words make, and the meaning carried by the phonemes. Then a child learns how to decode writing and can pronounce it, but only aloud. As decoding matures into true reading, a child will still not usually be able to read silently. He or she has to articulate the words out loud to achieve reading, and there is often an intermediate stage on the way to full deep literacy when a child whispers and moves his or her lips while reading. Only later can the more mature child advance into silent reading, and gain the ability to meet an author halfway in the dynamics of deep literacy that are common to literate adults.
Consider too that the only time a person can be alone with serious ideas brought to him or her by others is in reading. Otherwise, one is engaged in dialogue with one or more other people: one experiences the community, in other words, simultaneously with the ideas. Deep reading alone is the key source of the interiority that is the hallmark of “modern” humankind, and this is clearly an achievement of culture. There is a genetic capacity for understanding and articulating oral language. Any child can do it, and almost all do. But there is no gene for reading and writing. That is a cultural development, and no culture can be “modern” in the sense that Western societies became modern without a critical mass of population that is deep-literate.
Moreover, no society can be politically modern—have a competent executive, formal accountability, or real rule of law—without a critical mass of population that is deep-literate, for all these institutions depend on the intersubjective grasp of abstractions. In the West these concepts have become second nature. But many societies still lack a mature capacity for abstractive thought, and so tend to be far more “concrete” or literal in the way they think.
This is a fact, like it or not. Just walk into a typical private home in various countries, and you will hit upon a clear and undeniable correlation. In “modern” societies, there will be many books in the house—children’s books, fiction, reference books, still other books, whether the standard printed kind or, increasingly these days, books in electronic format. In patrimonial, premodern societies, there will be few if any books in the average home. Within both “modern” and modernizing, transitional societies, this distinction is also clear: Educated people, usually the elites, have books around them; the mass of society typically does not.
What does this mean in practical terms? It means that young people must be trained to be deep-literate, and must be so trained in their own language, in their own cultural idiom. If a society cannot achieve this, it cannot be truly wealthy and secure in the 21st century, for other societies can and will achieve it, leaving those that cannot at a major disadvantage. This, again, is not the same as simply decoding written symbols. Most of the official data on literacy that governments send to the United Nations, or use themselves when they deal with the World Bank, for example, do not make the critical distinction between simple decoding and real literacy. These data are therefore all but worthless.
And it means that every teenage student in a developing nation must have access to an earth-science class and lab, with a teacher who knows how to use that lab. Learning to observe is the key to scientific rationality, for again, science is not a mere accumulation of facts but a process, and a distinctly human process. No society can be “modern” in a Western sense or in any sense with a population that does not know how to observe reality and reason about it without constraint or artificial impediment. Applied science is the gateway to genuine modernity, the more so as time passes, so that a society whose people cannot participate in that pragmatic adventure will not be able to keep up with those societies whose people can. And no one can do applied science and gain from the experience if he or she is not deep-literate.
Social Trust
Human beings are social animals, and like all social animals our natures comprise a wondrous mix of impulses to both compete and cooperate. Competition helps motivate us; cooperation constrains us as individuals from being too motivated for our collective good, and more important perhaps, provides an array of benefits to individuals that individuals cannot achieve on their own.
One way we strike a functional balance between competition and cooperation as a community of animals is through our knack for quickly sizing up the intentions of others. This we do through speech and a host of more anciently honed paralinguistic cues inherent in facial expressions and body language. We can glance at other people, as we glance for danger from predator animals and natural hazards in our ambient environment, and get a sense of whether we are bound to compete or cooperate with—or merely ignore—that person or group of persons at any given time. We will either recognize the appropriateness of the competitive or cooperative frame in a given context and engage, or sense inappropriateness and choose another response.
Repetition and routine greatly help us to reckon, so it stands to reason that we quickly get good at recognizing intentionality in those we interact with most often—family members. In a civilized context, we get good at estimating the intentions of others beyond the immediate circle of family to the extent that cultural sharing is deep and wide. It tends to be deepest within extended families (called tribes or clans by anthropologists), getting shallower as relationships become more distant. It tends to be deep among friends who live close to one another and engage in similar or complementary ways of making a living; propinquity encourages what evolutionary biologists call reciprocal altruism. It attenuates with looser bonds.
Culture sharing in large agglomerations of population, like modern nations, usually depends most on a common language, but also on a thick symbolic sharing enabled by various narratives to which members of the society are socialized. That sharing, which performs its function even among individuals who never lay eyes on one another, in turn depends on widespread literacy. Tribes and tribal confederacies do not evolve into “modern” nations without widespread literacy.
At their inner margins, too, human societies supplement sharing with what are often dismissed as mere manners, or etiquette, which allow relative strangers to gauge one another’s intentions over a wide range of social interactions—trading, sharing a meal, socializing, forming a militia or a fire brigade, seeking and providing medical care, and so forth. Contrary to common understanding, then, glancing actually involves a deep, penetrating assessment of our environment, and manners have a social history and purpose that is anything but trivial.
The essence of all this is that people who share common mazeways from having lived together and having been socialized generation to generation can anticipate what others are up to, and this results in a network of reciprocal shared expectations. Over time, these expectations become reified into norms, usually sacralized by religion into an articulated (and ultimately written) moral code. Informal norms and religious codes have become embedded ultimately in secular law, at least, in the Western world.
What this means, as Edmund Burke, Adam Smith, and many others of that era in the West understood, is that below the level of formal law—in the British and then the American case, customary law—lay centuries of informal development at whose root was the social network of reciprocal shared expectations. The most obvious example for Smith was the way routines of trading commodities and services served as the clear prerequisite, first, for elements of canon law and, later, for the development of formal legal contracts. Put a slightly different way, the Yale University political scientist James C. Scott insists that:
“Most villages and neighborhoods function precisely because of the informal, transient networks of coordination that do not require formal organization… [T]he formal order of the liberal state depends fundamentally on a social capital of habits of mutuality and cooperation that antedate it…”[3]
The existence of social networks of reciprocal shared expectations within a given community is what “social trust” means as a locution of contemporary social science. Having it conduces to efficient, reliable, and hence more voluminous exchanges at low transactional costs, thus adding to general prosperity via the balm of local comparative advantage. Having it allows people to relax a bit, take some basic stabilities of social relations for granted, and hence reduce the cognitive load of day-to-day transactions. Having it usually conduces, too, to a sense of shared responsibility for the commons, and hence to spontaneous informal cooperation in the face of challenge or miscellaneous perceived need from which everyone may benefit. In short, social trust facilitates transactions, provides a secure nest for the expression of individual creativity, and enables collective action for the common weal. Social trust works a bit like Adam Smith’s famous “invisible hand”: many small capillary-scale behaviors adding up to a functional whole that no planned, top-down conscious design could possibly produce.
Collective action can include security considerations—“eyes on the street,” as Jane Jacobs called it in a modern urban context in her famous The Death and Life of Great American Cities (1961), a concept related as well to James Q. Wilson’s famous “broken windows” insight. And it includes a sense of there being a certain seemingly natural common responsibility to care for the weakest members of the society: the very young, the very old, the ill, and the infirm. Institutional extensions of faith communities traditionally took up this burden in the West.
It is important to stress the role of social virtues—norms that the culture holds up as character traits to which individuals are bidden to aspire. As Francis Fukuyama put it: “The social virtues, including honesty, reliability, cooperativeness, and a sense of duty to others, are critical for incubating the individual ones…”[4] Wrote another social scientist, “Social life takes up and freezes into itself the conceptions we have of it,” so when social life produces conceptions of proper behavior and positive character traits, individuals through their behavior become the intertwined expositors, carriers, and transmitters forward of those conceptions.[5] Not all such conceptions, moral and otherwise, have obvious or direct implications for prosperity or communal security, but many obviously do.
Again, with reference to Burke and his era of liberal thinkers in the West—as the word liberal was used then—a community that takes care of itself in this highly interpersonal and relatively “flat” manner does not need the heavy hierarchical hand of government to do such things for it:
Men are qualified for civil liberty in exact proportion to their disposition to put moral chains upon their own appetites… Society cannot exist unless a controlling power upon will and appetite be placed somewhere, and the less of it there is within, the more there must be without.
The conclusion, as it moves from the domains of anthropology and sociology to political theory, almost states itself: The ballast provided by social trust enables a relatively light government to do for a collection of communities in a larger country only what they cannot readily do for themselves. Hence abundant reservoirs of social trust conduce to government that is not greedy of its purview, and hence is not just limited but self-limiting; that favors “flatter” federal arrangements; that is not overbearing in its demeanor (read: authoritarian); and that is not the usual early-modern mix of mercantilist and oligopolistic in its method of self-funding, but rather social-contractual.
Things do not always work out that way, or do not stay that way. Any given nation’s supply of social trust can be vitiated for a host of reasons: excessive immigration-driven or simply preexistent heterogeneity, technology-driven isolation, the backwash of institutional dysfunction, the media-driven “mean world syndrome,” family instability and breakdown, the decline of traditional religious mores, and the excessive intrusiveness of the state. Of these, the last is perhaps the most pernicious, but also the most subtle.
Thomas Hobbes was half right and half wrong to call life in the absence of a sovereign (read: the state) “nasty, brutish, and short.” Yes, humans are competitive, but that is not all they are. Jean-Jacques Rousseau was also half right and half wrong to believe in the noble savage whose nobility is effaced by the depravities of civilization. Yes, humans naturally cooperate, but that is not all they do. As already noted, the sinews of normal social life are composed of informal mutualities arising from the habits of culture, and the sum is designed to constrain and channel both our competitive and our cooperative nature. This is, again to briefly repeat, the organic social glue that enables a liberal form of government to be limited, and self-limiting.
Should that glue dissolve for whatever reasons, a governing elite has two choices: It can substitute formal institutional authority for depleted informal mutuality, or it can try to find a way to restore informal mutuality. Left-of-center types in the West tend to prefer the first option; right-of-center types prefer the second, if they can think of a way to effect the restoration. The trend in recent decades in Western countries has been toward the first, and it may be that an enlarged state does not so much follow from Hobbes’s take-no-prisoners egoist as create that egoist. We quoted Scott above; now let us see more of the same quote:
[T]he formal order of the liberal state depends fundamentally on a social capital of habits of mutuality and cooperation that antedate it, which it cannot create, and which in fact it undermines. The state, arguably, destroys the natural initiative and responsibility that arise from voluntary cooperation. Further, the neoliberal celebration of the individual maximizer over society… encourage[s] habits of social calculation that smack of social Darwinism…. [W]e are in danger now of becoming precisely the dangerous predators that Hobbes thought populated the state of nature. Leviathan may have given birth to its own justification.[6]
In other words, spiting Tocqueville, Americans in particular may have foolishly deprecated their own communalist past in the name of an ideological manqué of individualism, leading to more intrusive government to mend the resultant tears in the social fabric, leading in turn to the destruction of the basis for creative individualism. The result of intrusive government, then, is not to substitute in a positive way for the loss of the organic communal reciprocal expectations of behavior, but to erode them further. The liberal state can suffocate the social basis of its own legitimacy, the wellspring of its own vitality, when it fails to be sufficiently self-limiting.
What this means for societies seeking to become more “modern” is clear: Government must not destroy the diversity of a society’s social vernaculars in pursuit of a forced-march modernization campaign. If these social vernaculars are destroyed in search of some rapid progress, and that search fails, it may not be possible to put the old system back together again. So it would be most ironic if modernizing elites, occasionally royal families arising within tribal societies, ended up mimicking the failures of the authoritarian command vanguard party.
What it means, too, is that at a time when “flatter” distributed systems, or networks, have an advantage over rigid centralized structures, Atatürk and Reza Shah are no longer models for Middle Eastern modernizers to emulate. Any modernizing elite that tries to centralize governmental management in a society not used to it will encounter pushback from sources of social authority rooted in local vernacular approaches to social order, as in Afghanistan. Any elite that tries to import institutions thought conducive to rapid political modernization will also run into trouble. The imposition of electoral democracy on a still largely tribal, communal-agency society like that of Kenya risks violence and near civil war almost every time there is a national election.
This means that in many countries it is wiser to reform agriculture in rural areas before attempting a forced-march industrialization in urban zones. This is what China did successfully: It slowly marketized agriculture, experimenting with methods until those that worked were identified, and then built rural industry on that foundation. Since interventions in dynamic webs are inherently unpredictable, one must experiment before scaling up, and rural environments can be ideal laboratories for this.
In general, too, the idea that rapid change is socially stabilizing is one of the stupidest and most ahistorical ideas that political leaders tend to hold. History shows that rapid change, even and especially rapid economic growth, is highly destabilizing. This is in part what Joseph Schumpeter meant by his famous phrase “creative destruction.”
Modernizers need to build patiently on the vernaculars they have inherited and that are organic to the culture. This may take longer to achieve desired results than the forced-march approach, but it is more likely to be sustainably successful, and successful without relinquishing the sense of unique corporate identity that means a great deal to most people. Modernizers in too much of a hurry often have reason to regret their lack of patience.
Institutional Coherence
Without sufficient human capital and social trust, sound institutions have a hard time developing. This is why many developing countries are plagued with administrative competencies too weak to govern large political entities. This leads to what the late Mancur Olson once called “the reimportation of imperialism” in the disguised form of international institutions like the World Bank and the foreign aid programs of wealthier, more technologically advanced countries.
But even with adequate human capital and social trust, institutions only function as designed when attitudes and institutions correspond. Without the legitimacy that comes of that match, institutions work but poorly. So, again, a careful building out of existing institutions that have the benefit of legitimacy will generally work better than institutional shock therapy. What this means is that, in practice, sociology is more important than economics.
Of course, there are times and cases when existing institutions stand athwart the purposes of a modernizing elite; hidebound clergy, for example, can frustrate a great many positive innovations. In cases like that, leaders must ponder the methods and means of making a change without provoking counterproductive resistance. This is never easy, and general advice is hard to come by: Every case is different, and only locals who know their own environment as if by intuition can make those judgments. It is folly to rely on foreign “experts” in such cases.
One general observation can perhaps help, however. And that is the insight that government organizations can usually do only what they were set up to do and are experienced in doing, for better and for worse. To get bold new policies and improved policy outcomes, it is usually necessary to innovate in government delivery systems. Put a little differently, the structure of government presupposes its function, so if leaders want the functions to change in order to get improved policy outcomes, then the structures have to change as well.
It is one thing to design properly a single institution. Effective “modern” government needs to do more than that: It needs to develop relationships among institutions so that their whole is more than the sum of the parts. Educational institutions, the military, the courts, scientific establishments and universities, business leaders and more all need to learn how to work with one another rather than against one another, as so often occurs in government contexts when it is time to allocate resources in the budget. Reservoirs of social trust help in this process, but more than that is usually necessary.
One way to achieve institutional harmony is for government, in the most light-handed way it can manage, to align incentives in such a way as to reward cooperation. For this purpose, there is nothing wrong, and often a great deal right, with governments managing financial markets to spur competition and create functional incentives.
Money helps, but so does status bestowal. So another way to proceed is for government chaperones, so to speak, of civil society institutions to rotate from institution to institution, forming a senior executive service that has experience of different institutional setups. Knowing how to create protocols for institutional cooperation is a critical specialty in government that is often overlooked and hence hard to come by in time of need. Troubleshooting and mediation are valuable skill sets, but they work best when institutional leaders are led to own the solutions to problems, rather than having those solutions forced upon them from above.
In the end, there is no real substitute for enlightened leadership at all levels. Wise governmental elites seek to put trustworthy people in key positions, and then have the good sense to get out of the way and let such people show their talents. Over-management from above is the worst way to get institutions in society to cooperate with one another; pockets of self-reliance at key levels in a hierarchy are what work best. Certainly, built-in and built-up resentment is toxic to a cooperative inter-institutional setup. Teamwork requires the cultivation of respect and trust, and a lot of practice—and this, again, often takes a fair bit of patience. Finally in this regard, in society in general and in the cadres of government, cohesion and trust are more important than participation. The idea that everyone must have a say—democracy in society in general, endless committee meetings in organizations—is much overrated. People need their interests to be taken into account, and to be treated with basic dignity. That is not the same as floods of participation in every last detail of social, business, and government life.
Enlightened leadership is a rarity; it is today, and it always has been. Having a clear concept of what one wants for one’s people is critical, and that concept is wisest which takes the full breadth of human social nature into account. That requires, in turn, the humility to understand that all good things do not always go together; tradeoffs are inevitable, and to resist admitting that is a formula for exhaustion and failure. Priorities must be established and adhered to, and above all patience must be exercised as the master discipline.
It is a kind of paradox to say that things are always changing, always evolving in social life, and yet admit that change can be so slow as to be imperceptible. Yet both are true. A wise leader knows he cannot quicken the tides, but that if he is skillful enough he can navigate them safely and chart a desirable destination. Mortals can do no more.
[1] Huntington, Political Order in Changing Societies (Yale University Press, 1968).
[2] By complex market capitalism is meant the existence of institutions such as banks and other financial mediators, futures contracts, primary and secondary insurance arrangements, and, above all, the existence of stock and bond markets and market-funded partnerships known as corporations. Obviously, smallholder and “shopkeeper” market arrangements do not require these accouterments.
[3] Scott, Two Cheers for Anarchism (Princeton University Press, 2012), pp. xxi-xxii.
[4] Fukuyama, Trust: The Social Virtues and the Creation of Prosperity (Free Press, 1995), p. 43.
[5] See generally Erving Goffman, Frame Analysis: An Essay on the Organization of Experience (Harper & Row, 1976).
[6] Scott, pp. xxii-xxiii.