This is the right time and right place for cognitive data, but our institutions will resist this transformation.
The interpretive framework for society for most of the past several thousand years has been some variant of feudalism (at least since the hardening of social and political power under Tiberius, the tristissimus hominum); identity and value depended on hierarchical relationships. Feudal organizing principles persisted into the 18th century (cf. Blackstone’s Commentaries on the Laws of England, wherein crimes and punishments depended on social standing within the feudal hierarchy). To a significant degree, this feudal structure survives today.
Ancient medicine was premised on a balance of black bile, yellow bile, phlegm and blood, but by the late fifteenth century, the West was ready to move beyond Hippocrates and Ptolemaic balances to medieval surgery. The transition is apparent in the contrast between Da Vinci’s Vitruvian Man of 1485, which was based on the ancient Roman architect Vitruvius’s ideal proportions of human symmetry (balance) that mirror a supposedly universal symmetry, and his drawings of dissected human corpses and detailed reports twenty-five years later. From Vitruvian Man to the publication of Vesalius’s De humani corporis fabrica in 1543 (which carried forward the dissection-driven anatomy that Da Vinci’s unpublished drawings had pioneered), the West began a push from top-down theories of harmonious balance to bottom-up empiricism. This new empiricism would elevate observation to drive conclusions and challenge the feudal construct of theory driving observations. The challenge was quick and violent; Martin Luther’s 95 Theses of 1517 is largely a list of observations, and the church, and much of Europe, was taken aback by a man who dared to observe.
The height of the challenge of empiricism is the U.S. Declaration of Independence. It begins with a hypothesis (“all men are created equal”) and then follows with the bulk of the text: a list of observations. Like the 95 Theses, the Declaration is largely a document of observation. The observations lead to what the colonists felt was an obvious conclusion. The contentious debate among historians that followed over the nature of the Declaration is the result of post-empirical analysis that applies theory-driven interpretive structures to a non-ideological document. Similarly, the U.S. Constitution is not ideological (hence the conflation of liberty and slavery). The U.S. founding documents are products of observation.
But the progress of empiricism was impeded by systemic barriers: human interest in measuring usually vastly exceeds human capability to measure, and human desire to observe is usually much greater than human capacity for objective observation. An additional problem with engaging this new empiricism is the human fetish — the leitmotif of nearly all inquiry — for balance. And so as the feudal interpretive framework weakened in the eighteenth century, those seeking to replace it began to generate their own theories and interpretive frameworks, much as the ancients did, in order to guide observations. The process begins with creating a theory (interpretive framework, data categories, intended outcomes), making and affixing labels, and then seeking data to confirm the theory. The theory drives the data and allows those who employ the framework to seek and interpret data to build narratives, which gild reality with balance and allocate power. The progress of empiricism reached its zenith — noticeably far from European shores — with the Declaration of Independence, but it otherwise never really had a chance against the fetish for balance.
Limited by human measurement capabilities, empiricism needed a way to circumvent the need for high-quality observation. While the concept of binomial nomenclature had been circulating in Europe for a century, it was Linnaeus in the 1730s who captivated audiences in lectures and books by systematically categorizing the natural world. Arguments from strict categories — capitalism/socialism, left/right, Christian/pagan, Marxist/free market, black/white — all stem from the Linnaean interpretive conception of the universe. Everything could have a succinct definition, be placed within a nested hierarchy, and be referenced to a type specimen — the specimen that reflects the archetypal example of the definition. The path from feudalism to modernity was paved with top-down binomial nomenclature taxonomy — i.e., one hierarchical interpretive framework was replaced with another.
Linnaean efforts took their next great leap forward with Diderot’s encyclopedia project, which would organize and categorize “each and every branch of human knowledge” in order to “change men’s common way of thinking.” Diderot’s Encyclopédie eventually consisted of 28 volumes, with 71,818 articles and 3,129 illustrations. In the Encyclopédie, Diderot wrote, “The goal … is to assemble all the knowledge scattered on the surface of the earth, to demonstrate the general system to the people with whom we live, & to transmit it to the people who will come after us, so that the work of centuries past is not useless to the centuries which follow….” Thus the supposed departure from the past was demarcated by an enormous effort to systematically re-categorize everything. These systems and categories and labels would provide the framework through which the moderns would seek data — proof — and build narratives to replace feudalism. The height of this framework was the Declaration of the Rights of Man and of the Citizen (1789), which was, in contrast to the Declaration of Independence, a work of theory, not observation. The Enlightenment — if it ever began — was thus over.
Such frameworks — theories and ideologies — are represented as, and thus often perceived to be, objective, but they are not in fact driven by observational objectivity. Modern peoples have consistently interpreted data and events through the filter of theory, including the American founding (Linnaean-labeled as “free market representative democracy”). Theories are often gilded as ideology and thus rise to the level of empowered agents of coercion.
In the early 19th century, frameworks for interpreting data increased in kind, quantity and complexity faster than did the ability to gather objective data to support them. The quantity of data and the use of interpretive frameworks exploded in the late 19th century, particularly in the sciences and social sciences (e.g. Darwinian evolution, Marxism, phrenology). As the interest in balance (or restoring balance, a close narrative cousin of fairness) continued and quantities of largely unincorporated, poor-quality data increased, so did theories — of government, of education, of individual human cognitive abilities and of macroeconomic national capabilities. (The bell curve, an artificial construct born of the balance fetish, became popular during this time.) In education, we would take a theory of learning, derive learning progressions from it, then fit in the data.
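The mechanics of that fetish are easy to demonstrate. In the sketch below (all numbers hypothetical), deliberately skewed “raw scores” are graded on a curve by mapping each score’s percentile rank onto a normal quantile; the output is bell-shaped no matter what was actually observed, which is to say the theory drives the data:

```python
import random
from statistics import NormalDist, mean

random.seed(0)

# Hypothetical raw scores, deliberately skewed -- nothing bell-shaped here.
raw = sorted(random.expovariate(1.0) for _ in range(1000))

# "Grading on a curve": replace each score with the normal quantile of its
# percentile rank. The result is bell-shaped by construction, regardless of
# what was observed -- the curve (the theory) drives the data.
target = NormalDist(mu=75, sigma=10)  # assumed target distribution
curved = [target.inv_cdf((i + 0.5) / len(raw)) for i in range(len(raw))]
```

However lopsided `raw` is, `curved` is symmetric around 75 by fiat; the apparent balance is imposed, not discovered.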
Such fictional narratives (utilized in lieu of empirical observation) were explored by Walter Lippmann in Public Opinion (1922). Lippmann was skeptical of the potential for democratic society given the cognitive limitations of people and the potent force of stereotypes (labeling) and propaganda (narrative construction). Lippmann argued that humans inhabit narrative fictions to adjust for perceptual parallax, and, while such fictions are leveraged by corporate and political media, the primary catalyst for narrative construction is personal intellectual disengagement (e.g. people are too lazy to engage in empiricism with consistency, so they filter reality through narratives and basic signifiers — labels, stereotypes, etc.). The active pursuit — not mere tolerance — of narrative fictions is thus a result of intellectual indolence and an effort to forestall cognitive dissonance. Lippmann, however, was an elitist who viewed such narrative fictions as a framework from which the government and its elites could inform — that is, control — the masses. For Lippmann, the interpretive framework was not a problem but an opportunity. (This is precisely the same view as Stalin’s and Hitler’s.)
And yet as soon as the era of theory began to achieve dominance across all modes of knowledge production, it was challenged again by empiricism. By the 1930s, scientists working in quantum mechanics began to more formally assert doubt regarding the measurement of particles. The measurement difficulties of quantum mechanics weren’t in themselves too important — no one thought the particles of an object were going to swing into anarchy and your chair would collapse. But the idea proved important: even our improving abilities to measure may be insufficient, and our ability to make grand statements may be limited not just by our limited ability to measure but by our very understanding of the concepts of measurement. At the same time, historians likewise began to retreat from making macro observations to instead make a few micro observations; history grew fond of exploring minor individuals (social history) instead of the grand histories of nations and ideas. This occasional and reluctant withdrawal from grand theory was a defense mechanism that wasn’t necessarily more accurate. The increasingly limited purview of historians during this time — focusing all their observational powers on small subjects — should have increased accuracy, and perhaps at times did; however, the predilection to operate under interpretive frameworks produced interpretive filters that yielded historical pictures of interpretive frameworks — the data was the paint, the framework was the subject — instead of pictures of empirical reality. (The defense, there, was that such reality does not exist — French philosophers have a defense for every intellectual fetish.)
The Complexity of Reality
And yet the challenge of empiricism remained. As the nineteenth and then the twentieth century produced ever more data, measurement limitations were aggravated by limitations imposed by quantity. The rate of growth in quantity was far greater than any capacity to analyze it. Even with the advent of servers in the 1960s, the process was limited largely to collecting and sorting; analysis at such quantities remained elusive. The incentive for analysis may have been further dampened by the implicit observation that the quality of the data was often suspect. The grand theory school was thus granted the opportunity to continue to build narrative, culminating in the top-down stepwise design and systems philosophy of the mid-twentieth century and financial theories such as the Black–Scholes–Merton derivatives pricing model.
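For readers unfamiliar with the model just named, its closed form for a European call option (a standard textbook formulation; the input values below are purely illustrative) fits in a few lines of Python:

```python
from math import log, sqrt, exp, erf

def norm_cdf(x: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(S: float, K: float, r: float, sigma: float, T: float) -> float:
    """Black-Scholes-Merton price of a European call.

    S: spot price, K: strike, r: risk-free rate,
    sigma: volatility, T: time to expiry in years.
    """
    d1 = (log(S / K) + (r + sigma ** 2 / 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Illustrative inputs: at-the-money call, 5% rate, 20% volatility, one year out.
price = bs_call(S=100, K=100, r=0.05, sigma=0.2, T=1.0)
```

The elegance is the point: five inputs and a closed form purport to price an uncertain future, which is exactly the kind of grand-theory confidence the essay is describing.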
The end of the twentieth century witnessed the collapse of the reign of universal interpretive frameworks. The Soviet Union — built on a late Victorian theory of history — collapsed, and Long-Term Capital Management, the hedge fund darling of the Black–Scholes–Merton model, lost investors almost five billion dollars, requiring a bailout from a consortium of banks. (The bailout occurred the year after the model’s creators received Nobel prizes, which often are nothing more than awards for the best advertisements for fictional narratives.) The collapse of interpretive frameworks most noticeably signaled by the fall of the U.S.S.R. was analyzed in Francis Fukuyama’s 1989 essay The End of History?, which, in an act of seismic confirmation bias, interpreted the collapse of one interpretive framework from within another. This deep irony was repeated on scales large and small over the next decade; top-down interpretive models were dying, and other interpretive models would pick over the corpses until they too became carrion.
In the first decade of the twenty-first century, we began to address the twin challenges of empiricism: collecting data of analyzable quality and analyzing it at scale. Upon reflection, what we discovered about data quality is something anyone on the ground in data collection likely knew: our data sets are not very good, even today. The quality of our data can fuel general analyses, which are sometimes heuristically good enough, but they can rarely be relied upon for precision analysis. Economic and financial data, crime data and education data have all proven insufficient on their own to drive actionable analysis. The collection and reporting of the number of murders in Chicago, or Venezuela, or Iraq is, in fact, quite poor. (The official number of murders in a municipality relies wholly on the mechanisms by which such acts become known to officials, which may or may not be an accurate reflection of the actual number — usually, it’s not.) We’ve discovered that we don’t analyze what good data we have, we collect a statistically significant amount of bad data, and there exist gaps in our understanding for which we have no data at all.
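The point about official counts can be made concrete with a toy simulation (all figures hypothetical, not real crime statistics): if the true number of murders never changes but the share of them that becomes known to officials improves, the official series shows a rising “trend” that reflects the reporting mechanism, not reality.

```python
import random

random.seed(42)

# Hypothetical figures -- illustrative only, not real crime statistics.
true_murders_per_year = 800
reporting_rate = {2010: 0.70, 2015: 0.80, 2020: 0.90}  # mechanisms improve

official = {}
for year, p in reporting_rate.items():
    # Each true event enters the official count only if it becomes
    # known to officials; everything else is invisible to the data.
    official[year] = sum(random.random() < p for _ in range(true_murders_per_year))

# The true count never changed, but the official series "rises".
```

Any analysis built on `official` alone would report a murder wave that never happened; the data describes the collection apparatus, not the city.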
We are, though, at the crossroads of top-down narrative construction — theory, interpretive framework, ideology — and bottom-up empiricism that rejects a priori narrative. We can, for the first time, begin to collect and process data and build models, like Martin Luther, from the data up. We can build from observations — which we’ve deluded ourselves into thinking we’ve been doing the entire time. Governments can develop regulation based not on a theory of how things should be but rather on observations of how things are. Empiricism’s 228-year hibernation is over, and the era of ideology may be over with it. Those who build narrative from ideology will not go quietly, and the capability to collect high-quality data will require fundamental changes in physical, social and cultural infrastructures. The process of re-engaging empiricism will likely take generations, just as originally igniting it did. Our modern institutions — governmental, academic — are empires of theory, narrative construction and interpretive frameworks. Some will sense an existential threat where others sense opportunity.
The Protestant Reformation did not begin with a theory but rather with a series of observations; Luther had no intention to destroy, replace, or build anew. He intended the 95 Theses to be a privately circulated inquiry into the nature of the church. It was his friends who published it. After the press, ideas became things unto themselves with their own decentralized ability to grow; such ideas could be fought and suppressed, but they were no longer fragile. The Reformation was an act of bottom-up empiricism that was quickly followed by top-down narrative construction. Implicitly and explicitly, Linnaeus and Diderot began a great transfer of power from the Ancien Régime of church and landed aristocracy to a new regime of state, academy, and urban elite. Linnaeus originally listed humans as monkeys with knowledge, and though he was forced to re-categorize humans into their own genus, he set off a century-long debate over whether humans were simply more reasonable monkeys. Diderot’s effort to establish a new elite is transparent: his taxonomic structure was based on reason, not theology; the entry Theology sits under Philosophy, and the entry Knowledge of God is conspicuously placed near the entry Black Magic. Diderot’s argument was clear: taxonomy had been weaponized and targeted at the Ancien Régime in a battle over the very definitions of existence. French authorities placed an official ban on the Encyclopédie to appease those being attacked, but they otherwise permitted the grand organizing of the universe to proceed. Taxonomy and language were thus transformed into the siege weapons that provided fuel and justification for the French Revolution, as Marx recognized when he selected Diderot as his “favourite prose-writer.”
The century following Linnaeus and Diderot witnessed an aggressive effort to organize and categorize the universe, which proceeded with fervor as social engineers and the new elite viewed each new generation as a tabula rasa on which they could restructure society; and each new category further attempted to limit fluidity and permeability by ascribing the dogma of taxonomy. This new taxonomy of knowledge and the resulting lexicon would be the hammer that would dismantle the church and state and enable a new class of elites to assume power. More importantly, this new lexicon would banish the old way of thinking — knowledge creation would be confined to the rigid taxonomies and acceptable narratives of the new world. Those who attempted to cross the newly demarcated strict boundaries were deemed aberrant, whether they were guilty of miscegenation or were the feted dandies of the previous decade who had become the dangerous femininity of Oscar Wilde.
Taxonomy became fetishized in the Victorian era, as state and academy sought to solidify their new claims on power and groups competed to categorize and define themselves and others. Increasingly narrow categories and complex hierarchies emerged as taxonomies of race, nationality, gender and existential value competed for positions atop the taxonomic pyramid. The state expanded its regulatory authority through increased precision and application; the law began to inhabit a new world of hyper-Scholasticism; both erected barriers of obscurantism with which to defend their Linnaean alliance with the press. Civil society aligned with this new taxonomic precision, from watches becoming widely available to literally ensure temporal precision to increased identification and political association with one’s class, labor, ethnic and religious genera. Fluidity and permeability were relegated to quaint hobbies or transgressive acts such as tarot card reading and spiritual mediumship. As hobbies or transgressions, they were acts against the prevailing power, which was not the church but rather the state and the academy. Fluidity and permeability no longer operated within the West’s core institutions, and the shamanism that had been a central feature of nearly all cultures was soon completely dismissed with the opening salvos of World War One.
Fueled by Diderot and starting with the French Revolution, wars (civil and otherwise) were fought and won or lost on propaganda. If the popular consciousness could be convinced that the war was one of liberation and rights against oppression (and wrongs), then success in the land war would typically follow. Stalin and Mao both viewed conquest as a matter of population control: wars won by seizing and altering consciousness, wars openly fought to inject ideology via new taxonomy. The great taxonomic success of the French Revolution was to create the genus nation, with the French people as its type specimen. This genus was separate from the genera of church and monarch. Marx created and defined the genus class consciousness to affix to France’s national consciousness. This taxonomy fueled revolutions from Russia to Germany and across late-colonial Africa and the Far East. Despite Marx’s fantasy, the genus class did not create an international proletariat, because it had to fight for survival in the new ecosystem with the genus nation; an older fluidity may have enabled a different outcome, but taxonomic fundamentalism creates barriers regardless of other apparent similarities, as the Soviets and Fascists soon discovered.
The era of weaponized taxonomy is nearing its end; it is subject to, and has failed, the temporal test of fragility. We see now the last great miasmas of a power structure in the throes of failure. The question, then, is not whether empiricism will be revived but how damaging the resulting friction will be.
Nota bene. Of course there are substantive distinguishing characteristics among, for example, the interpretative frameworks of math and physics and those of sociology. One of those characteristics is typically whether the purpose (usually inherent) is the aggregation of social and political power (though always gilded with some supposed civilizational necessity such as knowledge production). Another distinguishing characteristic is whether the input is force/law/artifact of the universe or human observation of human activity. All to be addressed later.
Nota bene II. I know the English article before the Latin phrase at the beginning is redundant. The Latin does not require it but the English suggests it. So I apologize to all the Latin readers. Yes, all of them.
Formerly of Xio Research, an A.I. appliance company. Previously a strategy and development leader at IBM Watson Education. His views do not necessarily reflect anyone’s, including his own. (What.). Nathan’s academic training is in intellectual history; his next book, Weapon of Choice, examines the creation of American identity and modern Western power. Don’t get too excited, Weapon of Choice isn’t about wars but rather more about the seeming ex nihilo development of individual agency … which doesn’t really seem sexy until you consider that individual agency covers everything from voting rights to the cash in your wallet to the reason mass communication even makes sense….