War made the state, and the state made war, but does this statement hold true today? Will it apply in the future? The consensus is that the absence of major war within the western world after 1945 caused the war–state relationship to change, in that each became significantly less important to the other. This article argues that the relationship was closer and deeper than has been assumed. It proposes that the peculiar strategic conditions created by the nuclear age caused states to wage a ritualistic style of war, in which demonstration rather than the physical application of violence became increasingly important. Within this setting, the state drove the process of technological innovation in defence to its limits in an effort to demonstrate its military superiority. This massive peacetime investment in defence technology exerted a huge impact on the character of war, which led to new strategic forms. Most importantly, however, the diffusion of military technology also affected the wider economy and society, leading to a form of internal power transition within states. The author speculates on how these elemental forces will play out in the future, what will happen to war and the state, and whether we will reach a point where war leads to the unmaking of the state.

This article explores the changing relationship between war and the state in the western world since the end of the Second World War. Specifically, it analyses how that relationship evolved during and after the Cold War, and extrapolates from current trends to speculate on what impact war will have on the future evolution of the state. Our understanding of the connection between war and the state assumes that war played an instrumental role in the formation of the state in the early modern period. The synergistic relationship established at that time then blossomed over the next four centuries, during which both state and war grew exponentially. However, this expansion was checked by the declining incidence and scale of interstate war after 1945, which eventually allowed new political and economic priorities to emerge, resulting in the reshaping of, and a changed role for, the state.1

The article presents an alternative view of the war–state relationship in the post-Second World War era. It does not challenge the logic that the decline in war affected the war–state connection.2 However, it does not see this change as evidence of atrophy. Instead, it demonstrates how the complexity of war after 1945 led to a deep but more subtle interaction, which had a profound effect on war, the state and society in the western world. While I do not challenge the premise that a range of factors played a role in shaping the connection between war and the state, the precise interaction and relative importance of these forces have altered over time, and this has caused the demands of war on the state to shift in significant ways. In the period under scrutiny in this article, I argue that the role of technology in war increased dramatically because of the nuclear revolution. In this setting, technological development reduced the opportunities for war, but the arms race it generated also brought into being new technologies, and these facilitated new forms of conflict. These developments affected our understanding of war's character and its interaction with the state.

Military history provides a rich literature on war and technology, but its focus has tended to be on the importance of technology in helping militaries win wars.3 In rarer cases, writers have sought to situate war within a broader technological, economic, social and cultural framework.4 This is where the principal focus of the present article lies. However, my aim is to turn this domain upside down and explore not just how the world has changed (and continues to change) war, but how the war–technology dynamic has changed the world, in what might be described as a form of positive feedback. To this end, I expand and build on the historical overview presented by William McNeill and Maurice Pearton of the financial and technical linkages forged between war and the state starting in the late nineteenth century.5 This provides a conceptual framework within which to explore how that relationship evolved and how it might change in the future. Most importantly, this construct allows the contemporary war–state relationship to be viewed through a different lens, one that sees a stronger, darker and more damaging connection than is generally recognized.

In addressing this issue, I have relied on the experiences of the United States and the United Kingdom, as representative examples of western states, to support the propositions set out here. First, in both cases the state played a leading role in promoting defence research after 1945; technology was of central importance in their strategic frameworks, and continues to be so today. Second, both states consciously exploited defence technology to promote wider economic prosperity. I recognize that attempts to look into the future carry a great deal of risk, and I explain below how I have taken this risk into account. The only general point I would make here is that history also shows that, sometimes, military forecasting is successful. I have looked at these examples and drawn on their methodologies.

In sum, the central argument of this article is that, after 1945, technology acted as a vital agent of change in the war–state relationship, and eventually the ripples of this change spread throughout society. To illustrate this point, one has only to look at the ubiquitous smartphone and the genesis of the technologies, produced by defence research, that made it possible. This capability has in turn affected the conduct of war, and through war the state. Thus the smartphone provides just one significant example of how technology and war are shaping the state and the world we live in.6

The article is divided into three parts. The first explores the war–state relationship and the factors that shaped it during the Cold War. It explains why technological innovation became so important in war, and how this imperative influenced both our understanding of war and the interaction between war and the state. The second section examines why the imperative for technological innovation persisted, and why the war–state infrastructure survived in the post-Cold War era. Finally, the third section explores how current trends might influence the war–state relationship in the future.

Reconceptualizing war: the rise of post-modern war, 1945–1989

Clausewitz missed the importance of technology as a variable in his analysis of war.7 Tilly, one of the most critical commentators on the war–state relationship, was also sceptical about the importance of technology in this process, and focused instead on the economics of waging war.8 The omission is understandable, because the history of war is characterized by long phases of technological stagnation punctuated by occasional spasms of revolutionary change caused by a variety of forces.9 This point is illustrated by a cursory glance at naval technology, which shows that ship design and armaments in Europe remained largely unchanged from 1560 to 1850.10 However, I contend that the importance of technology in the conduct of war increased dramatically from the nineteenth century onwards, for three reasons. The first was the impact of the Industrial Revolution. This period of sustained and rapid technological innovation eventually affected all areas of human activity, including war. Evidence of the increased pace of technological change can be seen in Schumpeter's economic analysis of capitalism and its relationship to technology. In his view, four long economic cycles in the Industrial Revolution led to ground-breaking changes in the mode of production in little more than a hundred years.11 At the microeconomic level, Schumpeter also challenged economic orthodoxy by arguing that capitalism was based not on price competitiveness but on innovation, via the creation of ‘the new commodity, the new technology, the new source of supply, the new type of organisation’. Schumpeter called this the process of ‘creative destruction’, whereby firms seek to innovate to achieve a position of monopoly and thereby maximize profits until that advantage is cancelled out by the next innovation.12

During this time, the technological needs of the armed forces ‘were met out of the same scientific and technical knowledge that manufacturing industry had put to use in satisfying its commercial needs’.13 As such, wider forces fed into the realm of war. However, this situation slowly changed such that the demands for military technology eventually shaped the wider context in which it existed—which brings us to the second reason why the importance of technology increased. O'Neill demonstrates how the state began to assume a role as a sponsor of technological innovation in defence in the late nineteenth century as the military became increasingly interested in the exploitation of technology. Such state sponsorship of innovation was termed ‘command technology’.14 However, as Hartcup observed, this process of innovation operated within military, fiscal and time constraints that imposed a limit on the ambition of defence research.15 In general, mass industrialized war in the twentieth century emphasized quantity more than quality, and required the mobilization of society and the economy via the state. The demands of war also resulted in the state expanding into the provision of education and health care to ensure the population was fit to wage war. Even liberal Britain succumbed to this view of the state.16 These features eventually became the defining characteristics of what Hables Gray called ‘modern war’.17

The third reason was the advent of the nuclear age, which precipitated a profound change in the organization and conduct of war. Hables Gray asserts that 1945 marks the dividing line between modern war and the birth of what he terms post-modern war.18 This philosophical construct is used as intended by post-modernism, not as a label, but as a way of indicating that war, like many forms of human activity, is a discourse.19 That discourse changed profoundly after 1945 because at that point scientific advance, in the form of nuclear weapons, made modern war impossible. This new strategic setting precipitated what Holsti described as the diversification of warfare; and this in turn resulted in a blurring of the line between peace and war as governments employed a range of means to achieve their policy goals below the threshold of general war. Most importantly, the forms of war proliferated as new ways were devised to employ war as a political tool in a nuclear world.20 This change did not render Clausewitz's concept of war obsolete, but it did require it to be adapted.21

Clausewitz explained that ‘war is an act of violence to compel our opponent to fulfil our will’.22 War is also the continuation of policy by other means.23 War, then, is defined as a discourse of physical violence to achieve a political goal. However, in examining the post-1945 war–state relationship in the West, we need to revise our understanding of war so that it extends beyond physical violence and bloodshed. Russian military reflections on the Cold War reveal an interesting narrative that reinforces this expansion of war beyond its traditional domain. According to this analysis, the Soviet Union lost the Cold War because it was defeated by non-military means employed by its enemy that focused on psychological, political, information, social and economic attacks against the Soviet state.24 Although this interpretation can be contested, it is important to acknowledge that states used both military and non-military levers to confront their enemies in this conflict. Technology played a vital role in facilitating this process, for example via the communications revolution, which enabled the waging of activities such as political warfare. However, the most salient aspect of the Cold War was the discourse of deterrence. Within this context, the rituals of war, in terms of organizing, preparing and demonstrating an ability to fight nuclear war in the hope of deterring potential opponents and thereby preventing war, became substitutes for organized violence. Small wars happened on the periphery of the US and Soviet geopolitical space, but in the core region, a different kind of cognitive and cultural violence emerged, which can be seen as a form of war.25

How, then, did technology fit into this new discourse of war? According to Buzan, because nuclear deterrence relied on anticipated weapons performance, it became sensitive to technical innovation, which meant the state had to respond to technological change by investing in defence research to maintain the credibility of its deterrent.26 As a result, a premium came to be placed on technological innovation in defence, and this caused the role of the state in military research to expand.27 Consequently, states came to play an essential part in a military version of Schumpeter's process of creative destruction.28 The role of the state was vital because it was the state that provided the critical financial resources required to take embryonic technologies and develop them at a speed unlikely to be matched by the civilian market. This facilitated a profound change in the relationship between the state and private industry and undermined the operation of the free market as governments opted to support defence contractors capable of conducting large and complex forms of research and development (R&D). This trend did not go unnoticed; in 1961, President Dwight Eisenhower warned against the pernicious influence exerted by the creation of a military–industrial complex (MIC), a construct which referred to the incestuous relationship between the military, defence industries and politicians acting in concert as an interest group to persuade the state to spend more on defence.29 Harold Lasswell also noted the rising prominence of the military in peacetime in his thesis of the ‘garrison state’, which described the potential militarization of the American polity.30 Samuel Huntington echoed this concern in his book The soldier and the state, which considered how the United States could manage an immense military establishment in a time of peace without jeopardizing the sanctity of its democracy.31 These debates and themes waxed and waned as the Cold War progressed, but they persisted, and even in the 1980s the notion of the MIC was still being discussed.32 The strategic logic of nuclear deterrence created a climate which justified high defence spending and significant investment in defence research—but why did this infrastructure persist in the more benign environment of the post-Cold War world?

The persistence of post-modern war after the Cold War

The end of the Cold War resulted in a significant fall in defence expenditure. Equally importantly, the state reduced its participation in sustaining defence research and allowed the private sector to play a more prominent role in defence production. In the UK, where the nationalized defence industries had already been privatized in the 1980s, this process was extended to include the sale of the state's defence research and development arm. This change in industrial and technological policy reflected a broader adjustment as the state lost its position in the vanguard of the technological revolution. Since the start of the Cold War, US government-funded defence research had given rise to technologies such as the internet, virtual reality, jet travel, data mining, closed-circuit TV, global positioning, rocketry, remote control, microwaves, radar, networked computers, wireless communications and satellite surveillance.33 The subsequent exploitation of these technologies by the private sector reflected a conscious policy choice by most western governments, which was to promote technology spin-offs from defence research into the wider economy as a way of generating wealth.34 Once the technology had been created, the civil, commercial sector proved adept at adapting and changing the new capabilities. The critical difference between innovation in the defence market and its civilian counterpart was that, in the latter, high rates of consumption led to product and process innovation by companies. As a result, civil technology providers increasingly took the lead in the information revolution. Given this new dynamism, military power relied increasingly on the existing pool of technological knowledge within the broader economy. The increasing emphasis on quality in war also generated greater complexity during operations. This trend facilitated the rise of private military companies in the post-Cold War era and resulted in western states increasingly subcontracting the provision of internal and external security to the private sector.35

However, in spite of the end of the Cold War, western governments continued to have an appetite for technological innovation and its integration into ever more complex weapons. Indeed, an important feature of post-modern war was that machines assumed an unprecedented importance in the post-Cold War era. As Hables Gray explained: ‘War is a discourse system, but each type of war has different rules of discourse. In postmodern war, the central role of human bodies in war is being eclipsed rhetorically by the growing importance of machines.’36

The First Gulf War was an important marker because it revealed to western society the power of technology, at least in a conventional war. As Freedman observed, this conflict resolved the high-tech versus low-tech debate which had persisted throughout the Cold War.37 Observers now spoke of a paradigm shift in the conduct of war and a revolution in military affairs (RMA) caused by technological advance in computers and communications.38 Paradoxically, cuts in defence spending and provision compounded the drive to rely on technology in war as smaller militaries sought to pack a bigger punch to compensate for their lack of mass.39 In the 1990s, the RMA served another purpose in that it allowed for the creation of what Shaw described as ‘risk-free’ war. Technology allowed western states to engage targets at long range with high accuracy, but at no risk to those firing the weapons—something that became very useful in an era of wars of choice.40 Perhaps the best example of the strengths and weaknesses of this approach was NATO's 78-day bombing campaign against Serbia in 1999.41

Technological innovation in the techniques of war allowed the state to continue using force as an instrument of policy, especially in those instances where there was no clear political consensus on taking military action. In sum, the state continued to see its security through the prism of technological advance; and this, in turn, helped to sustain the MIC in that brief period between the end of the Cold War and the start of the ‘war on terror’. The idea of an MIC persists today. For example, David Keen points to the powerful economic functions fulfilled by the war on terror, which he believes explain the persistence of a war based on counterproductive strategy and tactics.42 More recently, Paul Rogers has referred to the creation of a military–industrial–academic–bureaucratic complex, which is exploiting the latest iteration of the war on terror: the war against the so-called ‘Islamic State in Iraq and Syria’ (ISIS).43 While the technology paradigm was briefly challenged in Iraq in 2006 and replaced by a more labour-intensive approach to war, as articulated in the principles of counter-insurgency, this, in turn, was quickly replaced by less risky, more capital-intensive techniques of war waged with satellites, robots, drones, precision weaponry and special forces.44 In summary, the elaborate infrastructure of war created during the Cold War endured in the post-Cold War era before being reinvigorated by the fiscal stimulus generated by the war on terror. During this period technology was viewed almost as a silver bullet. As such, it provided a neat answer to complex questions posed by the human and physical terrain of war. Most importantly, for a brief moment at least, it allowed western states to reimagine decisive victories and tidy peace settlements.45 Such was the allure of technology that Coker speculated on the possibility of a future ‘post-human warfare’ in which machines replaced humanity on the battlefield.46

Post-modern war and the future of the state

How, then, will predicted developments in technology shape the future of war and the state? This is a question that is causing much anxiety in both academic and policy-making circles. As Freedman points out, the future is based on decisions that have yet to be made in circumstances that remain unclear to those looking into a crystal ball.47 Just as important as this uncertainty are those biases that shape our preferences regarding how we see the future. Cohen has pointed out that debates on the future of war often suffer from being technologically sanitized, ignoring politics and therefore lacking a meaningful context.48 As a result, the ‘future war’ literature often suffers from an overreliance on a simplistic overview of decisive military technologies. I address these problems in two ways.

The first is to follow the advice offered by the sociologist Michael Mann, who observed that no one could accurately predict the future of large-scale power structures like the state; the most one can do is provide alternative scenarios of what might happen given different conditions, and in some cases arrange them in order of probability.49 The UK's Development, Concepts and Doctrine Centre adopted this approach and set out multiple scenarios to support its analysis of future strategic trends.50 Second, it is essential to widen the lens through which the future is projected and to understand the political context within which technology, war and the state will all be situated. To this end, I adopt here the Clausewitzian framework of analysis which Colin Gray employed in considering future war. As he explains:

Future warfare can be approached in the light of the vital distinction drawn by Clausewitz, between war's ‘grammar’ and its policy ‘logic’. Both avenues must be travelled here. Future warfare viewed as grammar requires us to probe probable and possible developments in military science, with reference to how war actually could be waged. From the perspective of policy logic we need to explore official motivations to fight.51

In exploring the future relationship between war and the state, and the role played by technology, two possible visions are presented here. The first explores the continuation of the status quo and represents the default setting of both the UK and US governments with regard to the future. The second follows the recommendation offered by Paul Davis, who advised that, when selecting a scenario, one should choose a vision that challenges, provokes controversy and breaks out of orthodox thinking.52

Both models have one thing in common: they will be influenced by what might be seen as the next wave of technological change. This latest technical convulsion is illustrated by Schwab's idea of the fourth Industrial Revolution, which is a crude facsimile of Schumpeter's theory of long economic cycles. The fourth Industrial Revolution builds on the digital revolution, which began in the 1960s, but differs from it in that it entails ‘a much more ubiquitous and mobile internet, … smaller and more powerful sensors that have become cheaper, and … powerful artificial intelligence (AI) and machine learning’.53 The term ‘artificial intelligence’ was first used by the American scientist John McCarthy in 1956. According to his definition, AI is merely the development of computer systems to perform tasks that generally need human intelligence, such as speech recognition, visual perception and decision-making. More recently, Max Tegmark has defined AI as a non-biological intelligence possessing the capability to accomplish any complex task at least as well as humans.54 Currently, the exponential rise of AI is being driven by three developments in the world of computing: smarter algorithms, a vast increase in computing power and an ability to process vast quantities of data.55 What this means is that humans are now being challenged by machines in the cognitive as well as the physical domains of work. Digital technologies that have computer hardware, software and networks at their core are not new, but represent a break with the third Industrial Revolution because of the level of sophistication and integration within and between them. These technologies are transforming societies and the global economy.

The fourth Industrial Revolution is not only about smart and connected machines and systems. It is linked with other areas of scientific innovation ranging from gene sequencing to nanotech and from renewables to computing. It is the fusion of these technologies and their interaction across the physical, digital and biological domains that make the fourth Industrial Revolution fundamentally different from previous epochs. Emerging technologies and broad-based innovations are diffusing much more quickly and more widely than their predecessors, which continue to unfold in some parts of the world. It took the spindle, the hallmark of the first Industrial Revolution, 120 years to spread outside Europe; by contrast, the internet permeated the globe in less than a decade.56 In sum, it is not one specific technology but the sheer number of technologies and the interaction between them that is creating change on such an unprecedented scale that Schwab believes it can be described as a revolution. What, then, does this mean for the relationship between war and the state?

The first model of the future adopts a ‘business as normal’ scenario. In this version of the future, the policy logic of war remains focused on the security of the state and concentrates on state-based threats. The principal causes of war can be identified in the anarchy of the international system.57 The state preserves its monopoly on the use of force because the barriers to entry into the weapons market remain high. In addition, the state continues to function effectively and to be able to extract the resources needed to maintain its legitimacy and territorial integrity. Within this context, the state still pursues the development of advanced technologies to defend against mostly state-based threats. In this scenario, future war is imagined as a symmetrical contest between conventional forces on an increasingly automated battlefield. Within this space, humans will be augmented and in some instances replaced by AI and robots contending with increasingly lethal forms of weaponry.58

In this vision of the future, the military's pursuit of the next technology follows a familiar pattern, and the risk and uncertainty involved continue to make state finance and policy support indispensable to defence research. The most recent example of this activity is the UK government's promise to share with BAE Systems the cost of funding the development of a technology demonstrator for the next generation of fighter aircraft. Named Tempest, this fighter will be able to operate either as a manned or as an unmanned aircraft; it will rely on AI and employ directed energy weapons.59 A grander example of the status quo scenario is the American-led ‘Third Offset’ strategy, a programme designed to preserve America's military-technological superiority. At the core of the Third Offset is the intention to exploit advances in machine autonomy, AI, quantum computing and enhanced digital communications to improve the man–machine interface in the future battlespace.60 The United States is investing US$18 billion in the creation of these capabilities, even though it is not clear how feasible the development of technologies such as AI will be.61

It is important to note that non-western states are also pursuing these policies. The outstanding example here is China. Its economic model, which is based on state-sponsored capitalism, is enabling it to work in a close partnership with privately owned Chinese tech firms to achieve a broad-based technological self-sufficiency in both commerce and defence.62 Investment in research and development has grown by 20 per cent per year since 1999 to the point where China now spends US$233 billion per annum, a sum that accounts for 20 per cent of the world's research and development spending.63 Three technologies, it is claimed, matter most to China, and all three relate to its ability to control the internet. These are semiconductors, quantum computing and AI.64 In 2017, China accounted for 48 per cent of all AI venture funding, and the Beijing government aims to be the centre of global innovation in AI by 2030.65

In this scenario, then, the state can harvest and refine a range of new technologies generated by the private rather than the public sector in a manner that preserves its monopoly on the use of force. At the same time, that monopoly is reinforced because of the complexity of these capabilities and the challenges posed in their use on operations, which require well-trained and professional forces. Private military companies will persist, but their existence will rely on their ability to draw on this pool of trained personnel created by the state to populate their organizations, which means they will support, not challenge, the state's role as a provider of security.

In the second scenario of the future, the policy logic of war reflects a darker, dystopian image of the relationship between war and the state. In this setting, conflict is a product of desperation caused by scarcity, which is occurring on a global scale. Most importantly, the causes of war lie within states as well as between them. In this multifaceted crisis, technological change is weakening rather than strengthening the state and undermining its ability to cope with the tsunami of problems sweeping over it. The debate over this view of the future policy logic of war began in 1972 with the publication of a hugely controversial book called The limits to growth.66 This study explored the impact of population growth, industrialization, pollution, and resource and agricultural shortages on the global economic system. Its principal conclusion was that population growth would create an insatiable demand for goods, outstripping the finite resource base of the planet. Humanity's efforts to address this imbalance between demand and supply by increasing productivity would be self-defeating and cause a host of environmental problems. In spite of the passage of time since its first appearance, this book set out themes that are explicitly linked to the spectrum of security issues we face today.67 Moreover, a 2014 study conducted at the University of Melbourne claimed that the world might still be moving along the trajectory mapped out in 1972, and that economic and environmental collapse could happen before 2070.68
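The overshoot dynamic at the heart of this argument is easy to make concrete. The following toy simulation is a minimal sketch, not the World3 system-dynamics model used in The limits to growth: it assumes a single population drawing on a single finite, slowly regenerating resource, with all parameter values chosen purely for illustration.

```python
# Toy overshoot-and-collapse dynamic in the spirit of The limits to growth.
# Illustrative sketch only, not the World3 model: one population draws on
# one finite, slowly regenerating resource; all parameters are assumed.

def simulate(years=300, pop=1.0, resource=100.0,
             growth=0.03, use_per_capita=0.5, regen=0.1):
    """Return (year, population, resource) triples for a depletion dynamic."""
    history = []
    for year in range(years):
        # Each head of population tries to consume a fixed share per year,
        # but can only take what is actually left in the resource base.
        harvest = min(resource, pop * use_per_capita)
        # The shortfall between need and harvest chokes off growth: a full
        # harvest gives +3% a year, half gives stagnation, less gives decline.
        wellbeing = harvest / (pop * use_per_capita)
        pop = max(pop * (1 + growth * (2 * wellbeing - 1)), 0.01)
        resource = max(resource - harvest + regen, 0.0)
        history.append((year, pop, resource))
    return history

for year, pop, resource in simulate()[::25]:
    print(f"year={year:3d}  population={pop:7.2f}  resource={resource:7.1f}")
```

Under these assumptions the population grows smoothly for decades, exhausts the resource base around the two-thirds mark of the run, and then declines steeply. Notably, lowering use_per_capita (greater efficiency, the analogue of the productivity fixes the book criticizes) only lets the population climb higher before a later and larger collapse, which is the sense in which such fixes are self-defeating.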

There is a general assumption that the worst effects of these environmental trends will be for the most part experienced outside the western world. Even when western states are affected, it is assumed, rich countries will possess the financial means to weather this future storm. However, a recent report by Laybourn-Langton and colleagues challenges this simplistic assumption and points to the social and economic harm being caused globally by current forms of human-induced environmental change. These authors also demonstrate that no region of the world will be untouched by this phenomenon, and use the UK as a case-study to illustrate the point. In their view, the degradation of the environment will interact with existing political and economic trends to undermine the cohesion and internal stability of states across the globe.69 Interestingly, the report's analysis of the challenges facing governments has not been contested, although their proposed solutions in terms of radical economic reform have been strongly challenged by economists.70

Current trends suggest that a potential environmental crisis might run in parallel with a possible economic crisis. Ironically, the source of this predicament lies in potential problems generated by the fourth Industrial Revolution. Like the military, business is also fast approaching a time when machine intelligence can perform many of the functions hitherto carried out by humans in a range of occupations. As McAfee and Brynjolfsson explain, innovation gave machines the advantage in occupations that relied on physical labour, allowing new forms of economic activity and employment based on human cognitive abilities to develop.71 However, this cognitive comparative advantage is now under threat, as computer algorithms have reached a point where they can outperform humans in many jobs.72

As in the military domain, so in our economic and political affairs it is predicted that AI will precipitate a revolution. A PricewaterhouseCoopers report predicted that 38 per cent of all jobs in the United States are at high risk of automation by the early 2030s.73 Most of these are routine occupations such as those of forklift drivers, factory workers and cashiers in retail and other service industries. This depressing analysis is supported by the Bank of England's estimate that up to 15 million jobs are at risk in the UK from increasingly sophisticated robots, and that their loss will serve to widen the gap between rich and poor.74 Most worrying is the fact that, in the short term, the jobs most at risk are low-paid and low-skilled occupations, which are precisely the jobs the UK and US economies have been so successful in generating to create record levels of employment since the financial crash of 2008.

As in the past, those most affected by this change will be the economically least powerful sectors of society—the old, and unskilled and unorganized labour. Until now, the managerial and professional classes have been able to use their economic and political positions to protect themselves from the worst effects of such crises.75 The big difference about this revolution is that AI is threatening traditional professional middle-class occupations. Any job that can be done via the application of pattern-searching algorithms will be vulnerable. This includes banking and finance, the law and even education. Daniela Rus has argued that humans need the personal touch in their day-to-day lives and that humans are therefore guaranteed to have a place in the job market.76 Sadly, Harari challenges even this view, and claims machines can mimic empathy by monitoring blood pressure and other physical indicators in interactions between AI and humans.77 A recent report by the Wall Street Journal supports this view. In its investigation of the use of AI in the provision of psychological therapy, the paper found that people preferred the treatment offered by the AI precisely because it was a machine and so they did not feel judged. The system can also be configured to fit people's preferences, creating a 3D computer-generated image that is comforting and reassuring.78

A significant limitation of AI and machine technology is that currently they cannot replicate the dexterity of humans in handling delicate objects, and this does leave a role for humans in the workplace. However, scientists in California are looking at the use of AI and machine technology as a way of addressing the acute labour shortages experienced in the fruit-picking industry; this includes the development of machines capable of deciding which fruit is ripe for picking, and doing so in a way that does not damage the produce during picking, processing or distribution. Given these developments, Harari's prediction for humans in the workplace is bleak. ‘In the twenty-first century we might witness the creation of a massive new unworking class: people devoid of any economic, political or even artistic value, who contribute nothing to the prosperity, power and glory of society.’79 The mass unemployment generated would be on an unprecedented scale and likely to precipitate instability and violence.80

Further evidence to support the depressing scenario depicted here is provided by the former head of Google China, Dr Kai-Fu Lee, a man with decades of experience in the world of AI. In his view, AI ‘will wipe out billions of jobs up and down the economic ladder’.81 A typical counter to this view is that AI will lead to the creation of new jobs and new careers; but, as Tegmark explains, the evidence does not support this claim. If we look back over the last century, what is clear is that ‘the vast majority of today's occupations predate the computer revolution. Most importantly, the new occupations created by computers did not generate a massive number of jobs.’82

What then are the political and security implications of this profound economic change in terms of war and the state? Although depressing, the scenario depicted above does not mean we are condemned to what Martin Wolf describes as a kind of ‘technological feudalism’.83 As Gurr points out, past economic crises have provided political incentives for social reforms: for example, the New Deal in the United States, which represented a revolutionary change in how central government sought to manage the economy.84

According to Wolf, three factors might determine how well the state deals with these challenges: first, the speed and severity of the transformation we are about to experience; second, whether the problem is temporary or likely to endure; and third, whether the resources are available to the state to mitigate the worst effects of these changes. In the past, western governments have deployed a range of policies to deal with recessions or, as in the 1970s, scarcity of resources such as oil. However, these macroeconomic policy responses operated on the assumption that such crises were temporary, and that economic growth would resume and normality be restored quickly if the right measures were in place. In contrast, the environmental crisis and the AI revolution are happening rapidly and both will be enduring features of economic and political life. In Wolf's view, this latest revolution will require a radical change in our attitude towards work and leisure, with the emphasis on the latter. He also believes we will need to redistribute wealth on a large scale. In the absence of work, the government might resort to providing a basic income for every adult, together with funds for education and training. The revenue to fund such a scheme could come from tax increases on pollution and other socially negative behaviours. In addition, intellectual property, which will become an important source of wealth, could also be taxed.85

However, the introduction of these measures will not necessarily prevent a rise in politically motivated violence. As Gurr explains, recourse to political violence is caused primarily not by poverty but by relative deprivation. This is defined as ‘actors’ perception of discrepancy between their value expectations and their environment's apparent value capabilities'.86 As such, it reflects the difference between what people believe they are legitimately entitled to and what they achieve, perceptions of which have become acute in the age of the smartphone. Relative deprivation applies to both the individual and the group. Seen in this light, the bright, shiny new world created by AI provides a potentially rich environment for relative deprivation—particularly if large swathes of the middle classes are frustrated in their ambitions and suffer a loss of status as a socio-economic group.87 More worrying is that this technological and economic revolution will coincide with the global deterioration of the environment set out above, which also challenges the state.
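Gurr's definition lends itself to a simple formal statement. The notation below is an illustrative formalization only; the symbols E and C are shorthand introduced here, not Gurr's own notation.

```latex
% Illustrative formalization of relative deprivation (RD); E and C are
% assumed shorthand for Gurr's terms, not his own notation.
% E = value expectations: what actors believe they are legitimately entitled to.
% C = value capabilities: what their environment appears able to deliver.
\[
  RD \;=\; \frac{E - C}{E}\,,
  \qquad 0 \le RD \le 1 \quad \text{for } 0 \le C \le E .
\]
```

On this reading, deprivation intensifies either when capabilities C fall (automation-driven loss of work and status) or when expectations E inflate (the constant visibility of other lives in the age of the smartphone), which is why the two trends described above compound one another.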

Within this scenario, states in the western world will struggle just as much as states in the developing world. If the legitimacy of the state is measured in terms of its capacity to effectively administer a territory under its control, then the political context set out here poses a significant threat to this institution. The extraction of resources through taxation will prove extremely difficult as the tax base shrinks. This will affect the ability of the state to provide the public goods the population expects and requires. A weaker state, which lacks the resources and capacity to sustain the population, will also lack legitimacy; this could cause the social contract to break down and result in widespread violence. What, then, will the future grammar of war look like in this political and social context?

In this version of the future, the most fundamental aspect of the technology–war interaction will be the challenge to the state's retention of the monopoly of violence. Projections about the end of the state's monopoly on the use of force have been made before, but the current trajectory of technological change is making this threat more plausible, and bringing it closer.88 This speculative line of enquiry was given substance in 1999 by two colonels in the Chinese People's Liberation Army, Qiao Liang and Wang Xiangsui. Their study was conceived mainly within the context of a future war between the United States and China, and so their thinking was developed within the setting of a state-based conflict. However, their central thesis is relevant here because they believed the world was living in an unprecedented age in terms of the speed and breadth of technological innovation. There are, they argued, so many essential technologies emerging that it is difficult to predict how these will combine, or what the effect of these combinations might be in military and political terms. Developments in biotechnology, materials technology, nanotechnology and, of course, the information revolution are creating new opportunities and ways of attacking other states.89 An important observation made in Unrestricted warfare is that new technologies, which could be used as weapons, are increasingly part of our normal day-to-day lives.90 In sum, the colonels identified a range of non-military means that are technically outside the state's control and that might allow a weaker actor to fight and defeat its more powerful adversary. The 20 years that have passed since the first publication of Unrestricted warfare have demonstrated the prescience of the authors in respect of what are deemed to be new types of conflict today. For example, what they called ‘superterrorism war’ seemed to come to fruition on 9/11. We can see how state and non-state actors have exploited emerging everyday technologies to challenge powerful nation-states. Of great importance is the way in which groups such as ISIS and revisionist powers such as Russia have weaponized social media in their efforts to weaken those who oppose them. ISIS, indeed, claimed that media weapons could be more potent than atomic bombs.91

It is believed that Russia is increasingly relying on non-military means to challenge the West. Not surprisingly, evidence is mounting that it influenced the outcome of the 2016 US presidential election.92 This form of activity is now a persistent feature of the conflict spectrum and is practised by a variety of states.93 In August 2018, Facebook closed 652 fake accounts and pages with ties to Russian and Iranian state-based organizations. In both cases, the objective appears to have been to influence domestic politics in the UK, the US, the Middle East and Latin America. Four campaigns were identified, three of which originated in Iran.94 With over 2 billion Facebook accounts to police, it is feared this practice will persist.

The blurring of the distinction between military and civilian technology is not the only reason technology is becoming more accessible. Moisés Naím points to the falling cost of many technologies used in both defence and the civilian sector, which is making them accessible to weak states and violent non-state actors.95 An excellent example of this trend can be seen in the domain of synthetic biology, a new field that combines the power of computing and biology to ‘design and engineer new biological parts, devices and systems and redesign existing ones for other purposes’.96 In 2003, the Human Genome Project completed the first full sequencing of human DNA. The successful completion of this project took ten years and was the result of work done in over 160 laboratories, involving several thousand scientists and costing several billion dollars. It is now possible to buy a DNA sequencing device for several thousand dollars and sequence a person's genome in less than twenty-four hours. So steeply, in fact, have sequencing costs fallen that the industry is no longer profitable in the developed world and is now primarily conducted within China. By way of example of the potential threat posed by this new science, in 2005 scientists, worried about the possibility of another flu pandemic, recreated the Spanish flu virus which during and after 1918 killed 50 million people in two years. In 2011, scientists employed these techniques to manipulate the H5N1 bird flu virus and create a variation which could be spread from the avian to the human species. It is feared the technical bar to entry into this domain is now sufficiently low that it can be exploited for nefarious purposes by individuals or groups.97 Precisely the same fears have been expressed about the cyber domain. According to one Israeli general, ‘cyber power gives the little guys the kind of ability that used to be confined to superpowers’.98 In the future, we might even be able to make weapons via 3D printers. In theory, it is possible to build a handgun or even an assault rifle with this technology.

However, before concluding that the state is about to wither away, we need to remember that these technologies are still maturing. Therefore, whether or not advances in the cyber domain will undermine or reinforce the power of the state remains a contested point. As Betz points out, launching a successful attack against another state via this medium can be very costly. The Stuxnet computer virus, which was used to attack Iran's nuclear programme, was a very sophisticated piece of software developed by a dedicated team of specialists over a long period. The successful insertion of this virus also required high-grade intelligence on the Iranian nuclear programme. Consequently, the success of a cyber attack depends on a combination of capabilities, not just the development of a virus, and at the moment this puts the state at a considerable advantage.99 A similar point can be made in the case of 3D printing: you need to do more than just download the code to print the weapon. You also need access to complicated and expensive computer-aided design software and a high-quality metal 3D printer capable of using steel, aluminium or nickel. Such a machine costs over US$100,000, which is nearly 60 times the price of a standard 3D printer which uses plastic. The latter has been used to print plastic guns, but these proved unreliable and likely to explode in the user's hand.100

Finally, technology will also allow the state to attempt to counter internal threats to its authority. Stephen Graham notes that a significant trend in the war on terror has been the blurring between civilian and military applications of technologies dealing with control, surveillance, communications, simulation and targeting. The capability to exercise control via technologies which are intended to provide a service, such as parking and congestion charging, has dramatically increased the opportunities to conduct electronic surveillance for a host of other purposes.101

Conclusion

‘War made the state, and the state made war’ is a maxim that has shaped our historical understanding of this relationship. In the West, the general absence of major war since 1945 changed the war–state relationship, and there is now a consensus that each is significantly less important to the other. My aim in this article has been to provide a more nuanced understanding of the war–state relationship that emerged after 1945.

The existence of nuclear arsenals made total or modern war obsolete. Within this strategic setting a new form of war emerged. Post-modern war did not require the state to mobilize its entire population and economy to fight a life-or-death struggle against other states, largely because its principal focus was on devising ways to use military power to deter war or devising new means to attack the enemy's moral rather than its physical power. As a result, the logic of war transcended simple notions of battle and victory. War between the Great Powers and their allies tended to be confined to the grey zone between peace and open violence. However, the drive for technological innovation, caused by the peculiarities of the Cold War, ensured that war and the state remained strongly connected, as only the state had the capacity to stimulate research and development on the scale required to ensure the efficacy of strategic deterrence.

The drift towards more capital-intensive modes of warfare continued in the post-Cold War era. Technology gave western governments the internal independence to prosecute wars because such wars demanded little sacrifice from society. In a period characterized by a plethora of politically unpopular ‘wars of choice’, this allowed states to employ force in pursuit of even vague, value-based objectives. Most importantly, these new means of war enabled nuclear-armed states to continue fighting each other in the space between war and peace using both military and non-military means. We have seen evidence of this in Ukraine and in the South China Sea.

This corporatist alliance between the state and private industry had impacts on politics, the economy and society, but in ways that did not conform with recognized patterns of behaviour associated with modern war. This is possibly why the war–state relationship since 1945 is viewed in terms of decline. However, the persistent debate about the existence of the MIC, admittedly a crude construct, is evidence of the survival of the war–state relationship and of its wider impact. The clearest evidence of this can be seen in the role played by military research in causing and accelerating scientific invention, which has been instrumental in bringing about dramatic economic, political and social change in contemporary western society. Most important of all are the non-military means created by military research which are now being exploited by both state and non-state actors. As Graham explains, western scientific research has gone through a cycle from defence to the commercial world and back again:

Hence, technologies with military origins—refracted through the vast worlds of civilian research, development and application that help constitute high tech economies, societies and cultures—are now being reappropriated as the bases for new architectures of militarized control, tracking, surveillance, targeting and killing.102

Looking to the future, the likelihood is that war will continue to have a significant impact on the state. Commentators today note with concern the ways in which technology is undermining the state's monopoly on the use of force as the technical and fiscal barriers to weapons production fall. However, capability should not be equated with intent, and people rarely decide to initiate violence without cause. For this reason, it is important to reflect on the political context, which will provide the policy logic for war in the future. The most important potential effect of projected technological change is the transformation of the means of production, which could trigger huge economic and political turmoil in the West. If the fourth Industrial Revolution proves to be as disruptive as is predicted, it will lead to increased instability and possibly violence. These developments will weaken the state and damage its legitimacy as it struggles to fulfil the needs of its population. Western states may be able to deal with this transformation; but if it coincides with the predicted deterioration in the global environment, the institution of the state will struggle to bear the combined weight of the demands imposed on it. Under these circumstances, civil conflict might result. The irony here is that the technological preparation for war after 1945 sowed the seeds of the state's demise, playing an important role in creating the conditions that might cause a future existential crisis of the western state. Not only has that technological advance created the conditions for war, especially civil war, but it has also compounded this threat by democratizing the means of violence and empowering non-state actors. In the future, then, the war–state relationship could take an unexpected turn; and war might actually precipitate the unmaking of the state.