Stranger Than We Can Imagine - by John Higgs

We have a theme emerging. That theme, seen repeatedly across the wide sweep of modernist culture, was the idea that a single viewpoint was insufficient to fully express or describe anything. It’s an idea that is already familiar to us. The core concept behind Einstein’s revolution was that there was no single perspective which could be considered correct or true, and that our knowledge of any subject depends on the perspective we take.

Einstein and the modernists appear to have separately made the same leap at the same time. They not only recognised that we are bound by relative perspectives, but they found a higher framework, such as space-time or cubism, in which the subjectivity of a single perspective could be overcome. In 1878 Nietzsche wrote that ‘There are no eternal facts, as there are no absolute truths.’ Einstein and Picasso were both offering their own solutions to Nietzsche’s complaint.

Why was the reality of the First World War so different to initial expectations? Why was it not over by Christmas, in line with most contemporary assumptions about the nature of conflict? The answer, in part, is technology. It was the first industrialised war.

It was scale that both created and ended the imperial world. Empires were born when population growth caused egalitarian structures to break down. They ended when technology had grown to the point where warfare could no longer be tolerated. The imperial system, it turned out, was not the unarguable, unavoidable system of human organisation that it had been believed to be for most of history. It was a system that only functioned during a certain period of human and technological growth. If warfare was no longer acceptable in the industrialised world then emperors, tsars and kaisers could no longer be trusted with the power they held. They had stupidly led the world into horror once, and could do so again. The concept of emperors, one of the great constants in human history, was finished.

The traditional method of regicide was not hanging or burning, but beheading. If you intended to kill a king it was necessary to chop off his head, as Charles I of England and Louis XVI of France unhappily discovered. This was highly symbolic. It was not just their actual head you were hacking off, but the head of the political hierarchy. An absolute monarch was the omphalos around which the rest of society orientated itself. Squabbles with ministers aside, law and sometimes religion were whatever that emperor decreed them to be. You might not have liked what an emperor did, but you understood that power was theirs. You knew what your role in the hierarchy was, and you orientated yourself accordingly. Without the omphalos of emperors, society was a jumble of different, relative, individual perspectives, all fighting for credibility and political power. This is what was so remarkable about the changes that occurred in the first decades of the twentieth century. The sudden departure of emperors across huge swathes of the planet was the removal of single, absolute, fixed perspectives. This is a story we’ve already seen played out in different arenas. Art, physics and geopolitical structures all underwent similar revolutions around the same time, for seemingly unconnected reasons. Politicians were wrestling with the same challenges that faced Einstein, Picasso, Schoenberg and Joyce: how can we proceed, now that we understand there is no ultimate perspective that every other viewpoint is subservient to? How do we reconcile contradictory positions? When our previous ways of thinking are fundamentally flawed, how do we move forward?

For those countries that did not take the communist route out of the failing imperial world, the trend was clear. Power could not be entrusted to absolute rulers in a world capable of industrialised warfare. The multiple perspectives of democracy were safer than the single vision of an emperor. With those emperors gone, political power was redistributed into the hands of individuals.

Crowley was announcing a new, replacement omphalos. It was one which would come to define the twentieth century: the individual. When the bonds of hierarchy were shattered, you were left with the multiple perspectives of a host of separate individuals. In the philosophy of individualism the self is the focus and is granted precedence over wider society. Support for individualism had been slowly building for centuries. You can trace its roots back to the Renaissance or the English Civil War. It was boosted by the Enlightenment and can be found in the work of writers such as François Rabelais or the Marquis de Sade. But few people had been willing to take it to its logical conclusion.

The importance of individualism found particularly fertile soil in the United States, which as we have noted was always uncomfortable with the rigid hierarchy of empire.

Ayn Rand did not believe that concern for the wellbeing of others should limit personal liberty. With her striking short black hair, cold piercing gaze and ever-present cigarettes, she quickly attracted a dedicated following. Her individualist philosophy, which she named Objectivism, promoted what she called ‘the virtue of selfishness’. Like Crowley, she viewed her mission as the establishment of a new, post-Christian morality.

Totalitarian states may appear to be the antithesis of individualism, but that very much depends on whether or not you happen to be running them. In the eyes of Crowley or Rand, a leader such as Hitler was admirably exercising his individual will. This is another example of how what is observed is dependent on the position of the observer.

Like Rand’s ‘enlightened self-interest’ and Adam Smith’s ‘invisible hand’, Crowley also had a moral justification for his philosophy. Do What Thou Wilt, he stressed, was very different to Do What Thou Like. This was due to the nature of what he called True Will. True Will was distinct from normal desires, and defined as action that was in harmony with the universe at large. It was action that occurred naturally and which was not undertaken for ‘lust of result’. As he saw it, by being ‘in harmony with the Movement of Things, thy will be part of, and therefore equal to, the Will of God … If every man and every woman did his and her will – the true will – there would be no clashing. “Every man and every woman is a star”, and each star moves in an appointed path without interference. There is plenty of room for all; it is only disorder that creates confusion.’ This idea is essentially Daoist. Crowley took the ideas of the sixth-century BC Chinese writer Laozi and flavoured them with a Nietzschean, proto-fascist outlook that chimed with the early twentieth century.

Freud developed another model of the mind in 1923, which helped him illustrate how neuroses form. He divided the mind into three separate sections, which he called the ego, the super-ego and the id. The id was like a hedonist, seeking pleasure and desiring new experience. It was driven but addiction-prone and it was, by definition, unconscious. The super-ego, in contrast, was like a Puritan. It occupied the moral high ground, remained steadfastly loyal to the laws and social conventions of the surrounding culture, and attempted to limit or deny the urges of the id. The task of negotiating between the id and the super-ego fell to what Freud called the ego, which roughly corresponded to the conscious part of the mind. In some ways the name ‘ego’ is misleading, for the word is commonly associated with an exaggerated and demanding sense of the self. Freud’s ego attempted to compromise between the demands of the id and super-ego in a realistic, practical manner, and would provide rationalisations to appease the super-ego when the urges of the id had been indulged. The ego acted like a pendulum, swinging back and forth between the id and the super-ego as events changed. A rigid, hierarchical culture promoted the arguments of the super-ego, for this was the part of the mind that strove to please its lord or master. The drives of the id, in such an environment, became taboo, for they were seen as working against a well-ordered society. When the imperial world collapsed in the early twentieth century, the psychological hold that an emperor had over the population weakened. At this point, motivations other than obedience to social norms surfaced. This was the move from Crowley’s patriarchal Age of Osiris into his child-focused Age of Horus, and the shock that so affected the audience of The Rite of Spring. The result was that, for many, the ego swung away from the super-ego and found itself facing the long-ignored id.

Freud’s model of the id, ego and super-ego was originally only intended to describe individuals. But there is a tradition of using Freudian ideas to help illuminate larger changes in society, for example in works like Wilhelm Reich’s The Mass Psychology of Fascism (1933). Freud’s psychological models can be used alongside a sociological concept known as ‘mass society’, which describes how populations of isolated individuals can be manipulated by a small elite. The idea of the ‘mass media’ is related to that of the mass society. Methods of controlling or guiding mass society were of great interest to political leaders. An example of the subconscious manipulation of mass society was the twisting of people’s reaction to different ethnicities in the 1930s. When political leaders promoted hatred of others, it created one of those rare instances that appealed to both the id and the super-ego at the same time. It was possible to unleash the barbaric, destructive energies of the id while at the same time reassuring the super-ego that you were loyally obeying your masters. With the id and the super-ego in rare agreement, the ego could find it hard to resist the darkness that descended on society. When the wild energies of the id were manipulated in a precise way, leaders could command their troops to organise genocides.

Jung was not interested in the question of whether UFOs were ‘real’ or not. He wanted to know what their sudden appearance said about the mid-twentieth century. Mankind had always reported encounters with unexplained somethings, strange entities which, if they existed at all, were beyond our understanding. Jung understood that the interpretation of these encounters depended on the culture of the observer. Whether a witness reported meeting fairies, angels, demons or gods depended on which of those labels their culture found most plausible. The fact that people were now interpreting the ‘other’ in a new way suggested to Jung that a change had occurred in our collective unconscious. As recently as the First World War, we still told stories about encounters with angels. One example, popularised by the Welsh author Arthur Machen, involved angels protecting the British Expeditionary Force at the Battle of Mons. But by the Second World War, Christian belief had declined to the point where meetings with angels were no longer credible, and none of the previous labels for otherworldly entities seemed believable. As a result, the strange encounters which still occurred were now interpreted as contact with visitors from other planets. The ideas of science fiction were the best metaphor we had to make sense of what we didn’t understand. UFOs, to Jung, were a projection of Cold War paranoia and the alien nature of our technological progress. He recognised that the phenomenon told us more about our own culture than it did about alien spaceships. As he wrote, ‘The projection-creating fantasy soars beyond the realm of earthly organizations and powers into the heavens, into interstellar space, where the rulers of human fate, the gods, once had their abode in the planets.’ We no longer considered the heavens to be the domain of loving gods and angels.

Anti-heroes with nothing to believe in became increasingly common as the 1940s rolled into the 1950s and the certainty of purpose that had characterised the war for the Allied nations receded into memory. They were particularly prevalent in the works of the Beat writers, such as Scotland’s Alexander Trocchi.

Trocchi was a heroin addict, and what he captured so acutely in this novel was the morality of the junkie subculture. The junkie is separated from wider society by both the illegality of their drug and its isolating effects. Heroin users are not concerned with contributing to the greater good. They see society as something to be worked in order to fund their habits. That outsider lifestyle, separated from the anchors of work, respect and family, carries a heavy emotional cost, but that cost dissolves away under the effects of the drug. While on heroin, the problems of existence evaporate. The individual user becomes a self-contained unit, at peace with both themselves and their separation from others. The heroin subculture is, in many ways, the ultimate expression of individualism. You can understand why Aleister Crowley was so fond of the drug.

The scientific, artistic and industrial breakthroughs of the interwar years had come so thick and fast that nobody had had time to come to terms with them. The political and moral crises caused by the emergence of fascism and communism had been a more pressing concern. It was not until peace arrived that people had time to take stock and re-evaluate where they were. What they found came as something of a shock. The great pre-First World War certainties were clearly no longer viable, but what, exactly, had replaced them? Uncertainty, paradox and irrationality were everywhere. Where was the postwar omphalos, the perspective from which everything made sense? Countless isolated individual perspectives all came to the realisation that there wasn’t one.

Sartre wrote that ‘Everything that exists is born for no reason, carries on living through weakness, and dies by accident.’ The core principle of existentialism is the recognition that life is meaningless, and that the experience of existing in the present moment is all that matters. The rest of Sartre’s philosophy was an attempt to come to terms with this. Sartre didn’t waste time arguing that there was no God. He understood that most thinking people of the time had already reached that conclusion themselves. At the point when Sartre’s novel found its audience, in the immediate aftermath of the Holocaust, there were few who would point at the world and declare that it was the work of a just and moral god. Sartre’s aim was to explore what it meant to be alive in a godless universe.

For Sartre and for Trocchi, denying the meaninglessness of existence was cowardice. It was necessary to fully confront our situation because, as Sartre saw it, ‘Life begins on the other side of despair.’ He believed that we were both blessed with free will and cursed with the awareness of the pointlessness of it all, which meant that mankind was ‘condemned to be free’.

The passive navel-gazing of a nihilist was a self-fulfilling prophecy. If you engaged in it, then life did indeed appear meaningless. But that perspective, crucially, only described the individual nihilist and not mankind in general. Outside of the existentialist bubble, there was value and meaning to be found. It did not require intellectual gymnastics, or faith in God or Marx. It just required energy, dedication and a desire to get involved.

Unlike the more nihilistic European Beats, the American Beat Generation flavoured their version of existentialism with Eastern mysticism.

There are distinct differences in the definitions of words like satori, beatitude, enlightenment, grace, rapture, peak experience or flow, but these terms also have much in common. They all refer to a state of mind achievable in the here and now, rather than in a hypothetical future. They are all concerned with a loss of the ego and an awareness of a connection to something larger than the self. They all reveal the act of living to be self-evidently worthwhile. In this they stand in contrast to the current of individualism that coursed through the twentieth century, whose logical outcome was the isolation of the junkies and the nihilism of the existentialists. But interest in these states, and indeed experience of them, were not widespread. They were the products of the counterculture and obscure corners of academia, and hence were treated with suspicion, if not hostility. The desire for personal freedom, which individualism had stoked, was not going to go away, especially in a generation that had sacrificed so much in the fight against fascism. How could we maintain those freedoms, while avoiding the isolation and nihilism inherent in individualism? Reaching out towards satori or peak experience may have been one answer, but these states were frustratingly elusive and too difficult to achieve to provide a widespread solution.

Von Braun had already decided that he wanted to surrender to the Americans. His argument was that America was the only country not decimated by the war, and hence the only country financially able to support a space programme, but it is also clear that his aristocratic background would not have been well suited to life in the Soviet Union. America, in turn, wanted von Braun primarily because they didn’t want anybody else to have him. Arrangements were quickly made to bring von Braun to America, along with his designs, his rockets and about a thousand other Germans (members of his team, along with their family members). An operation to whitewash the files of von Braun and other prominent Nazis in the team began. Jack Parsons’s old mentor von Kármán was part of this process. It was known as Operation Paperclip after the paperclips which were used to attach fake biographies, listing false political affiliations and employment histories, to their files. Following Operation Paperclip, even Nazis who were guilty of war crimes were eligible for life in the US.

LSD caused the user to see themselves not just as a self-contained and isolated individual entity, but as an integral part of something bigger. But explaining what this bigger thing was proved problematic, and resulted in the hippies talking vaguely of ‘connection’ and of how ‘everything was one’. In this they were similar to the modernists, attempting to find a language with which to communicate a new, wider perspective. The hippies turned to Eastern religions, which they learnt about from American Beats like Allen Ginsberg and English writers like Aldous Huxley and Alan Watts. They tried describing their experiences in co-opted Buddhist and Hindu terms, but none of those ancient metaphors were entirely satisfactory in the modern technological age. As vague and simplistic as it may have sounded, it was simpler to fall back on the most universal non-individualist emotion to describe their experience: love. It was for this reason that the 1967 flowering of the psychedelic culture became known as the Summer of Love. The emotion of love is an act of personal identification with an external other, in which the awareness of that person is so overwhelming that any illusion of separation between the two collapses. There is a reason why the biblical term for physical love was ‘to know’ someone. As such it is distinctly different from the isolating individualism so dominant in the rest of the century. What it is not, however, is an easily extendable organisational principle that can readily be applied to society as a whole. Christianity had done its best to promote love during the previous two millennia. The Church ordered its followers to love, through commandments such as ‘love thy neighbour’, as if this was reasonable or possible. But ordering people to love was about as realistic as ordering people not to love. Love just doesn’t work that way, and commanding it did not inspire the confidence in the Church that the Church seemed to think it would. It is noticeable that the more individualistic strains of American Christianity, which bucked the global trend of declining congregations, put less emphasis on that faith’s original teachings about love and social justice. The love culture of the hippies was brought low by the ego-fuelling cocaine culture of the 1970s and 80s. Attempts at describing a non-individualistic perspective were dismissed for being drug-induced, and therefore false. The hippies’ stumbling attempts to describe their new awareness had been too vague and insubstantial to survive these attacks and they were written off as embarrassing failures by the punks. Yet slowly, over the decades that followed, many of their ideas seeped into the cultural mainstream.

One way to understand the twentieth century’s embrace of individualism is to raise a child and wait until he or she becomes a teenager. A younger child accepts their place in the family hierarchy, but as soon as they become a teenager their attention shrinks from the wider group and focuses on themselves. Every incident or conversation becomes filtered through the ever-present analysis of ‘What about me?’ Even the most loving and caring child will exhibit thoughtlessness and self-obsession. The concerns of others become minor factors in their thinking, and attempts to highlight this are dismissed by the catch-all argument, ‘It’s not fair.’ There is a neurological basis for this change. Neuroscientists report that adolescents are more self-aware and self-reflective than prepubescent children. Aleister Crowley may have been on to something when he declared that the patriarchal age was ending and that the ‘third aeon’ we were entering would be the age of the ‘crowned and conquering child’. The growth of individualism in the twentieth century was strikingly similar to the teenage perspective. The teenage behavioural shift should not be seen as simple rudeness or rebellion. The child’s adult identity is forged during this adolescent stage, and the initial retreat inwards appears to be an important part of the process. But something about the culture of the mid- to late twentieth century chimed with this process in a way that hadn’t happened previously. In part this was the result of demographics, as the postwar baby boom meant that there was a lot more of this generation than usual. Adolescents were foregrounded and, for the first time, named. ‘Teenager’ is a word that was coined in the 1940s. As with the words ‘genocide’ and ‘racism’, it is surprising to find that the term did not exist earlier.

Thatcher’s focus on the primacy of the individual as the foundation of her thinking was perfectly in step with the youth movements of her time. The main difference between Thatcher and the young was that she justified her philosophy by stressing the importance of responsibility. At first this appears to mark a clear gulf between her and the consequence-free individualism of The Rolling Stones. But Thatcher was only talking about individual personal responsibility, not responsibility for others. Personal responsibility is about not needing help from anyone else, so is essentially the philosophy of individualism restated in slightly worthier clothes. This highlights the schizoid dichotomy at the heart of the British counterculture. It viewed itself as being stridently anti-Thatcher. It was appalled by what it saw as a hate-filled madwoman exercising power without compassion for others. It argued for a more Beatles-esque world, one where an individual’s connection to something larger was recognised. Yet it also promoted a Stones-like glorification of individualism, which helped to push British society in a more Thatcherite direction. So entrenched did this outlook become that all subsequent prime ministers to date – Major, Blair, Brown and Cameron – have been Thatcherite in their policies, if not always in their words.

Members of youth movements may have regarded themselves as rebels and revolutionaries, but they were no threat to the capitalist system. It did not matter if they rejected shoes for Adidas, or suits for punk bondage trousers, or shirts for Iron Maiden T-shirts. Capitalism was entirely untroubled over whether someone wished to buy a Barry Manilow record or a Sex Pistols album. It was more than happy to sell organic food, spiritual travel destinations and Che Guevara posters to even the staunchest anti-capitalist. Any conflict between the counterculture and the establishment occurred on the cultural level only; it did not get in the way of business. The counterculture has always been entrepreneurial. The desire of people to define themselves through newer and cooler cultures, and the fear of being seen as uncool or out of date, helped fuel the growth of disposable consumerism. The counterculture may have claimed that it was a reaction to the evils of a consumerist society, but promoting the importance of defining individual identity through the new and the cool only intensified consumer culture.

Through the insights of people like Lorenz and Mandelbrot, and the arrival of brute computational power, a major shift occurred in our understanding of both mathematics and nature. As the research accumulated, two surprising facts became apparent. When you looked closely at what appeared to be order, you would find outbreaks of chaos around the edges, threatening to break free. And yet when you looked deeply into chaos, you would find the rhythms and patterns of order. The discovery of order in chaos was of intense interest to biologists. The existence of complex life never really appeared to be in the spirit of the second law of thermodynamics, which stated that, in an isolated system, entropy would increase. Ordered things must fall apart, in other words, so it was odd that evolution kept generating increasingly intricate order. With the arrival of chaos mathematics, biologists now had a key that helped them study the way order spontaneously arose in nature. It helped them understand the natural rhythms of life, both in the internal biology of individual animals and in the larger ecosystems around them.
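The claim that order hides inside chaos can be made concrete in a few lines of code. The sketch below is my own illustration rather than anything from the book: it iterates the logistic map, the one-line population equation that became the textbook example of chaos, and shows the same deterministic rule settling into an orderly cycle at one setting, tumbling into chaos at another, and then, inside the chaotic region, snapping back into a simple three-beat rhythm. The helper name count_states and the particular parameter values are assumptions chosen only for illustration.

```python
# Minimal sketch of 'order within chaos' using the logistic map x -> r*x*(1-x).
# Not from the book; parameter values are illustrative choices.

def count_states(r, x0=0.5, warmup=1000, samples=500):
    """Iterate the logistic map, discard transients, then count how many
    distinct values (to 4 decimal places) the orbit keeps visiting."""
    x = x0
    for _ in range(warmup):
        x = r * x * (1 - x)          # let the orbit settle onto its attractor
    visited = set()
    for _ in range(samples):
        x = r * x * (1 - x)
        visited.add(round(x, 4))     # a short cycle yields only a few values
    return len(visited)

# r = 3.55 sits in the ordered regime: the orbit repeats a short cycle (8 values).
# r = 3.70 is chaotic: the orbit never settles, so hundreds of distinct values appear.
# r = 3.83 lies in a 'window' inside the chaos: order returns as a 3-value cycle.
for r in (3.55, 3.70, 3.83):
    print(f"r = {r}: orbit visits {count_states(r)} distinct states")
```

The point of the sketch is that nothing changes except a single number, yet the behaviour flips from order to chaos and back again, which is the kind of result that made chaos mathematics so useful to biologists studying natural rhythms.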

By the 1880s American courts, perhaps for the sake of a quiet life, routinely confirmed that corporations were individuals. They were, therefore, welcome to all the freedom and protection that the Fourteenth Amendment granted. It was a neat legal solution, and one which gave corporations the right to borrow money, buy and sell property, and take each other to court. But a corporation, plainly, wasn’t an individual. An individual can be sent to jail for breaking the law. A corporation cannot, and the threat of imprisonment is an important limiting factor in the behaviour of individuals. Another difference is that people eventually die. A corporation could theoretically be immortal. Death is a necessary part of natural ecosystems, in part as a means of stopping things from growing too large. Corporate shareholders had only limited liability for what the corporation did. They couldn’t lose more than the value of their stock. This system made shares easy to trade and increased participation in stock markets, because the credit-worthiness of the stockholder was no longer a concern that the market had to take on board. Corporations, it seemed, had been granted the great dream of the twentieth century: individualism without responsibility. But corporations did have a responsibility, and this strongly bound their actions. They were required by law to put their shareholders’ interests above everything else, and that included the interests of their employees, their customers and the public good. Those shareholders’ interests were almost entirely concerned with making money, so those corporations were legally bound to pursue maximum profit. Corporations had no choice but to become undying, unjailable profit-taking machines. And as the wealth of corporations grew, financial quarter after financial quarter, so too did their power. Corporations started to catch up with, and then overtake, nation states.

The reason why corporations were able to achieve such remarkable growth was, in part, externalities. An externality is the financial impact of an action which is not borne by the organisation that caused it. Externalities can be both positive and negative, but those created by organisations looking to reduce costs and increase profits tend to be negative. If a factory took so much water from a river that farmers downstream were no longer able to irrigate their land, then the farmers’ financial losses would not be on the factory’s balance sheet. Those losses, according to the factory’s accounts, were an irrelevant externality. Other examples of externalities include noise pollution, systemic risk to the banking system caused by irresponsible speculation, and the emission of greenhouse gases. An externality, in other words, is the gap between accountants and the real world. It is the severing of an important feedback loop, which could otherwise keep a complex system sustainable, ordered and self-stabilising. One way to deal with the problem of externalities is through environmental laws and industrial regulation. But as the constant stream of corporate malfeasance cases shows, corporations often find it more profitable to ignore these constraints and pay a fine if they are caught.

The strange attractor-like shift that occurred in the years leading up to 1980 was not apparent at the time. Economic growth continued as expected, but its impact on society began to change. The Princeton economist Paul Krugman calls the shift in American inequality that began at the end of the 1970s the ‘Great Divergence’, while the New Yorker journalist George Packer refers to the years after 1978 as the ‘unwinding’. The benefits of economic growth slowly removed themselves from the broader middle class, and headed instead for the pockets of the very richest. Good jobs evaporated, social mobility declined and the ‘millennial’ generation born after 1980 are expected to be worse off than the postwar ‘baby boomers’. Life expectancy has started to fall, in certain demographic groups at least. At the same time, inequality has increased to the point where, in 2015, the combined wealth of the eighty richest people in the world was equal to that of the poorest half of the world’s population, some 3.5 billion people. Even those who believe that those eighty people work hard for their money would have difficulty arguing that they work some 44 million times harder than everyone else.
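That last figure is just the arithmetic of the sentence before it, a back-of-the-envelope division using only the numbers quoted above:

$$\frac{3.5 \times 10^{9}\ \text{people}}{80\ \text{people}} \approx 4.4 \times 10^{7} \approx 44\ \text{million}$$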

This retreat of the American Dream, which had promised a future better than the past, is the result of a number of complicated and chaotically linked events from the 1970s. One of these was the rise of Deng Xiaoping to the position of Paramount Leader of the Chinese Communist Party in December 1978, in the aftermath of the death of Mao. Deng began the process of introducing a managed form of capitalism into China. The impacts of this would not be felt immediately, but the availability of cheaper Chinese labour for Western corporations would lead to the disappearance of well-paid Western manufacturing jobs, as well as destabilising trade imbalances. This process of globalisation also led to the disappearance of corporate taxes from government balance sheets. Corporations increasingly embraced globalisation and reimagined themselves as stateless entities in no way beholden to the nations that formed them.

A second factor was the collapse of the Bretton Woods Agreement in August 1971. This was a framework for international monetary management that had been agreed in the small New Hampshire town of Bretton Woods towards the end of the Second World War. The pre-war, every-man-for-himself approach to currency valuation had been responsible for some of the instability that led to war, so Bretton Woods was an attempt to create a more stable environment for international finance. It tied the value of international currencies to the US dollar, which was in turn tied to the value of gold reserves. This link between the dollar and gold was important. When the Bank of England had first issued banknotes in the currency of ‘pounds sterling’ in the late seventeenth century, this meant that the note itself was a convenient stand-in for a certain weight of sterling silver. The paper itself, therefore, was really worth something. For this reason, currencies have historically been backed by something that was both physical and in limited supply, such as gold or silver. This link between money supply and actual physical wealth created confidence in a currency, but it was a constant frustration for those who dreamt of perpetual, continuous economic growth. The gold standard, as the link between paper money and precious metals was known, tended to produce periods of growth interspersed with periods of deflation and even depression. This may have been a natural and sustainable system, but it was not the sort of thing that democratic societies voted for. President Nixon’s response to a period of economic pain was to end the gold standard, cut the link between the dollar and physical wealth, and ultimately bring Bretton Woods to an end. The value of the dollar could then float free, worth whatever the markets said it was worth. Thanks to this neat trick of divorcing money from physical reality, the perpetual-growth economy continued in what would otherwise have been a period of recession. It also transpired that the ever-increasing amount of consumption needed for perpetual growth could be financed by creative bookkeeping, and the creation of debt. Debt mountains began to grow. When that approach ran into difficulty, in the early twenty-first century, taxpayer-funded bailouts kept the dream of perpetual growth alive.

The intellectual justifications for the policies that led to the Great Divergence are commonly referred to by the term neoliberalism. Neoliberalism was a school of economic thought that dated back to the 1930s, but it only became the orthodox belief system for politicians and corporations following the election of Margaret Thatcher as British prime minister and the arrival of the economist Paul Volcker as Chairman of the US Federal Reserve, both in 1979. Neoliberalism, at its heart, argued that the state was just too dumb to be left in charge of people’s wellbeing. It just didn’t understand the nature of people in the way that the markets understood them. It had only a fraction of the information needed to make good decisions, and it was too slow, inept and politically motivated to put even that knowledge to good use. As the neoliberals saw it, the role of the state was to put in place a legal system that protected property rights and allowed for free trade and free markets, and to protect this system by military and police forces. State-owned industries needed to be placed in private ownership and run for profit. At that point the state had to step away and not interfere. Private enterprise would take care of the rest. Neoliberalism was always going to create inequality, but it claimed that this was for the greater good. If a small elite became immensely wealthy, then this wealth would ‘trickle down’ to the rest of society. Or in a phrase which came to symbolise the 1980s, ‘greed is good.’ Wealth did not trickle down, needless to say. It passed upwards from the middle class to the very top. Few economists now take the idea of the trickle-down effect seriously, but the thinking behind it still colours much of the discussion about global economics. It is still common to hear the very rich described as ‘wealth creators’, rather than the more accurate ‘wealth accumulators’.

What mattered was that each element was fun in itself. This is probably the most recognisable aspect of postmodernism, a collision of unrelated forms that are put together and expected to work on their own terms. The idea that an outside opinion or authority can declare that some elements belong together while others do not has been firmly rejected. A related aspect of postmodernism is what theorists call jouissance. Jouissance refers to a sense of playfulness. The French word is used over its closest English translation, ‘enjoyment,’ because it has a more transgressive and sexualised edge that the English word lacks. Postmodern art is delighted, rather than ashamed, by the fact that it has thrown together a bunch of disparate unconnected elements. It takes genuine pleasure in the fact that it has done something that it is not supposed to do.

Postmodernists have firmly internalised Duchamp’s insight that when different people read a book or watch a movie, they perceive it differently. There are many interpretations of a work, and it cannot justifiably be argued that one particular perspective is the ‘true’ one, even when that perspective is the author’s. People can find value in a work by interpreting it in a way that the author had never thought of.

The problem was that there was no mechanism inside postmodernism for weeding out the meaningless from the meaningful. As a result it became possible to build an academic career by sounding clever, rather than being clever.

Perhaps academia and postmodernism are fundamentally incompatible. Postmodernism denied that there was an external framework which could validate its works, yet that’s exactly what academia was: a system to categorise and understand knowledge in relation to a rigid external framework. Postmodernism’s rejection of external frameworks suggested that there were flaws in the foundation of academia. This embarrassed academia in the same way that Gödel’s Incompleteness Theorem embarrassed logically minded mathematicians. In these circumstances, the speed with which the orthodoxy became an insult is understandable.

The New Age movement was not noted for its firm foundations and, like postmodernism itself, it is routinely mocked for this reason. Yet to dismiss a spiritual movement on the rational grounds of factual inaccuracy is, in many ways, to miss the point. Religions and spirituality are maps of our emotional territory, not our intellect. The Christian faith, for example, uses a crucifix as its key symbol. Crucifixion was one of the most awful forms of torture in the Iron Age world, and the icon of the cross represents unimaginable suffering. The sight of the cross is intended to generate an emotional understanding of that suffering, rather than an intellectual one. Just as a joke is valid if it is funny, even though it is not true, so spiritual symbols succeed if they have emotional or psychological value, regardless of the accuracy of stories that surround them. To look at the symbol of the crucifix and question whether the events it represents really happened does miss the point. The appearance of the New Age movement illuminates how the great perspective shift of the early twentieth century affected our emotional selves. The fact that it happened at all is remarkable in itself, since unforced, wide-scale spiritual shifts affecting a sizeable proportion of the population are historically rare. The New Age was a rejection of hierarchical spirituality focused on the worship of a lord, and instead promoted the individual self to the role of spiritual authority. This produced almost exactly the same results in the spiritual world that postmodernism produced in the cultural world. Many varied and contradictory viewpoints were declared, leading to a highly personalised mishmash of world religions and spiritual practices. It welcomed astrology, Daoism, shamanism, tarot, yoga, angels, environmentalism, Kabbalah, the Human Potential Movement, ancient wisdom traditions and many more. Anything that shed light on the dual role of the practitioner as both a self-contained individual, and also part of the wider whole, was on the table. New Agers were free to take what they wanted for their personal practice, and to reject the rest. They became spiritual consumers in a marketplace of traditions. For those who kept faith in the existence of absolute certainty, it was all incredibly annoying.

We might curse relativity, and crave an absolute position. But that doesn’t change the fact that we do not have one. Postmodernism was not some regrettable intellectual folly, but an accurate summation of what we had learnt. Even if it was intellectual quicksand from which escape did not appear possible. And which nobody seemed to like very much.

The individualism that had fuelled the neoliberal triumph was not the end point for humanity that people like Fukuyama or Margaret Thatcher once believed it to be. It was a liminal period. The twentieth century had been the era after one major system had ended but before the next had begun. Like all liminal periods it was a wild time of violence, freedom and confusion, because in the period after the end of the old rules and before the start of the new, anything goes.

Imagine that the people of this planet were points of light, like the stars in the night sky. Before the twentieth century we projected a constricting system onto these points, linking them in a hierarchical structure beneath a lord or emperor. This system informed our sense of identity, and governed how we orientated ourselves. It lasted for thousands of years. It may have been unfair and unjust, but it was stable. At the start of the twentieth century that system shattered and those points were released to become free-floating, all with different perspectives. This was the postmodern relative world of individuals where, as Crowley put it, ‘Every man and every woman is a star.’ Here was the twentieth century in all its chaotic glory, disjointed and discordant but wild and liberating. But then a new system imposed itself on those free-floating points of light, just when we were least expecting it. Digital technology linked each of those points to, potentially, every other. Feedback loops were created, and consequences were felt. We could no longer be explained simply in terms of individuals. Now factors such as how connected we were, and how respected and influential we might be, were necessary to explain what was going on. Money is, and has always been, important. But the idea that it was the only important thing was an oddity of the twentieth century. There had always been other social systems in place, such as chivalry, duty or honour, which could exert pressures that money alone could not. That has become the case again. Money is now just one factor that our skills and actions generate, along with connections, affection, influence and reputation. Like the New Agers who saw themselves as individuals and, simultaneously, an integral part of a larger whole, we began to understand that what we were connected to was an important part of ourselves. It affected our ability to achieve our goals. A person that is connected to thousands of people can do things that a lone individual cannot. This generation can appear isolated as they walk the streets lost in the bubble created by their personal earphones. But they can organise into flashmobs in a way they never could before.

In Sartre’s 1943 philosophical book Being and Nothingness there is a short section entitled ‘The Look’. Here Sartre attempts to deal with the philosophical nature of ‘the other’ by imagining an example which owes much to the multiple-perspective relativity of Einstein. Sartre talked about looking through a keyhole at another person, who was unaware that they were being observed, and about becoming completely absorbed in that observation. Suddenly the watcher realised that a third person had entered the room behind them, and that they themselves were being observed. Sartre’s main point was concerned with the nature of objectification, but what is striking is how he described the awareness of being observed. For Sartre, being observed produces shame. Compare this to the digital generation. Watch, for example, footage of the audience at a music festival, and note the reaction of the crowd when they suddenly realise that they are being looked at by television cameras. This is always a moment of delight and great joy. There is none of the shame that Sartre associated with being observed, and neither is that shame apparent in that generation’s love of social media and seeming disregard for online privacy. Something has changed, therefore, in the sense of self which our culture instils in us, between Sartre’s time and the present.

In alchemy, there is a process known as solve et coagula. Solve means the process of reductionism or analysis: the reducing of a substance to its indivisible components. In the I Ching this is represented by hexagram 23, ‘Splitting Apart’. This is equivalent to taking a pocket-watch to pieces in order to understand completely how it works, and is necessary before the equally vital second stage of the formula can commence. Coagula is the reassembly of the pocket-watch in a perfected, or at least improved, form. It is the equivalent of holism as opposed to reductionism, or the purposive process of synthesis as opposed to that of analysis. In cultural terms, the individualism of the twentieth century can be likened to the process of solve taken to its logical end. Everything was isolated, and in isolation it was understood. The network allows the process of coagula. Things are being reconnected, but with a transparency and understanding that we did not have before the process of solve.