A snippet from a commentary on the first four chapters of The Wealth of Nations

CHAPTER IV. OF THE ORIGIN AND USE OF MONEY.

1. When the division of labour has been once thoroughly established, it is but a very small part of a man’s wants which the produce of his own labour can supply. He supplies the far greater part of them by exchanging that surplus part of the produce of his own labour, which is over and above his own consumption, for such parts of the produce of other men’s labour as he has occasion for. Every man thus lives by exchanging, or becomes, in some measure, a merchant, and the society itself grows to be what is properly a commercial society.

Chapter 4 is concerned with the economic institution of ‘money’, but it opens with a summary of earlier arguments and ends with a sentence that is strikingly linked to the subsequent development of political economy and to matters that are highly salient in economic discussions today. Here Smith characterises and gives a name to the kind of society in which he lived. It is a “commercial society” and its defining characteristic is that every man “lives by exchanging, or becomes, in some measure, a merchant”. ‘Every man’ is not to be taken literally, but more in the spirit of ‘Everyman’, a character in a story or play who leads an ordinary, prevalent and ubiquitous way of life which audiences can instantly recognise and identify with.

Going back to the illustrations from antiquity in Chapter 3, it can be observed that the coastal cities and cities on navigable waterways might be characterised as being ‘commercial’, but, heading off into their hinterlands, the ubiquity aspect will typically be found lacking in much of the surrounding territory. Smith would therefore not want to characterise them as ‘properly’ commercial. As in the earlier chapters, Smith’s centre of attention is the labourer, his Everyman, and the test is whether he/she “lives by exchanging”.

Today the typical word to characterise the economic system of a country like Britain is ‘capitalistic’, and that different terminology, which does not reflect well on post-Smithian intellectual development, is much more than the substitution of one word for another. ‘Capitalism’ is a vaguely defined concept. In ordinary use of language, it refers to an economic system for which it is easy to find advocates and opponents, often at vigorous odds with one another, but without any shared understanding of what precisely it is that they are arguing about.

De facto, and whatever its original meaning may have been, capitalism has become an ideograph, as defined by Michael Calvin McGee: “an ordinary-language term found in political discourse. It is a high order abstraction representing commitment to a particular but equivocal and ill-defined normative goal.” As to its origins, its first appearance in English occurred in a Thackeray novel and it was used only a couple of times by Marx. Marx typically referred to the capitalist mode of production, but a mode of production is a rather different thing to a whole society, and the manufacturing sector, with which he was most concerned, was the largest of the three major economic sectors – agriculture, manufacturing and services – only for a relatively brief period of the 19th century. For most of the second millennium, including most of the period that might be labelled the Industrial Revolution, services has been the sector that has contributed most to national output.

The superiority of Smith’s characterisation can be seen by asking two questions. First, is modern China a capitalist economy? As befits an ideograph, that will likely generate much heated debate – which might launch a new set of sects populated with zealots – but little light. Then ask: is modern China a commercial society? The appropriate answer is, I think: ‘Yes, next question please’. That doesn’t cast much immediate light on other characteristics of Chinese society, but it does at least provide the base for fruitful exchanges on those other matters, based on a common understanding of a term of art in political economy.

Social contact networks and the spread of Covid-19

Epidemiology is much to the fore at the moment and the curves that feature in the epidemiological modelling are to be seen on TV, in newsprint and in many a social media post. The models used vary according to the biological characteristics of the relevant bug and there is a good summary of a number of their various types here, where a general reader’s scan of the list will suffice to give a flavour: https://en.wikipedia.org/wiki/Compartmental_models_in_epidemiology#The_SEIR_model

They work via analysis of the interplay between variables that are defined by medical states, linked via a system of differential equations, which may be deterministic or stochastic. The chief negative feedback mechanism is from the number of people who have been infected (N) – one of the medical state-variables (one of the ‘compartments’ or boxes) – to the rate of new infections. The more people that have been infected, the fewer in number are those who remain susceptible to infection (S). The strength of this feedback pressure tends to gather force over time, becoming greatest in the later stages.

There is also a positive feedback loop which is dominant early on – the more people that are infected the greater the infection rate (because the number of those doing the infecting expands). The changing balance between these two pressures determines the shapes of the contagion’s curves.
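The interplay of the two feedback loops can be made concrete with a minimal deterministic SIR sketch in Python – a teaching toy with assumed parameter values, not a calibrated Covid-19 model:

```python
# Minimal deterministic SIR model, Euler-stepped. Parameters are illustrative
# assumptions: beta is the transmission rate, gamma the recovery rate, so the
# basic reproduction number here is beta/gamma = 2.
def sir(beta=0.4, gamma=0.2, i0=1e-4, days=300, dt=0.1):
    s, i, r = 1.0 - i0, i0, 0.0   # susceptible, infectious, recovered fractions
    path = []
    for step in range(int(days / dt)):
        new_inf = beta * s * i * dt   # rises with I (positive feedback) ...
        new_rec = gamma * i * dt      # ... but also falls with S (negative feedback)
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        path.append((step * dt, s, i, r))
    return path

t, s, i, r = max(sir(), key=lambda row: row[2])   # the point of peak prevalence
print(f"Peak prevalence {i:.1%} around day {t:.0f}; {1 - s:.1%} infected by then")
```

Early on, S is close to 1 and the positive feedback dominates, giving near-exponential growth; later the shrinking pool of susceptibles takes over and the curve turns down.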

Looking at this picture, a social scientist will naturally ask: where are the social variables (where the word ‘social’ is to be construed broadly to include, for example, the settlement and movement patterns studied in human geography)? The bug is invading a highly complex, socially-ordered network of encounters between individual members of society. These encounters are very far from being random in general, though they may be in individual instances.

So, consider one of the important parameters to be found in epidemiological models, the number of close (physical) contacts (CCs) that an infected person might have with others. This directly affects the basic transmission rate of the virus (i.e. Ro, the number of infections of others that an individual might cause in circumstances where all contacts are with others who are susceptible to the disease), and we can expect that basic number to vary significantly from one individual to another, in reflection of the heterogeneity of the circumstances in which individuals come into close contact with one another.

Let me, at this point, make a shift from epidemiological jargon to economic jargon and rename Ro as the ‘propensity to infect’ (PI). An average PI (API) can be calculated for the population as a whole, and it is such an average that is used when people talk about seeking to acquire ‘herd immunity’ by getting to a point where Ro < 1, in consequence of the negative feedback mechanism of a falling number of Ss.

Often missing from the epidemiological models, however, are the correlations between variables that can arise from the existence of social structures. Let me look at just one such correlation (since the aim of this note is to encourage thinking about the contagion in a different way, not to expound any new comprehensive view of it). It is that the close contacts (CCs) of many with high propensities to infect (PIs) will not be random, but will be tilted toward CCs with other high PI people, because of the social networks in which they move.

This could be true for professionals who work and socialise with other professionals (and they do tend to network quite a lot), or for inhabitants of multi-occupancy buildings in socially deprived areas. An immediate implication is that the early stages of a contagion will ‘naturally’ select for transmissions in which the people at each end are both high PI individuals, and hence that the early spread could be very rapid.

What taking this type of social factor into account requires is the addition of new equations to the differential equation system of an epidemiological model. That will obviously affect the solutions of the equations: in this particular case what the additional correlation implies is the introduction of another, negative feedback loop/mechanism (additional to the negative feedback that comes from a reduction in S as more and more people are infected). Once the contagion gets going, it immediately starts to thin out those who are high transmitters of the disease.

To illustrate, let’s take an extreme, hypothetical, ‘teaching’ simplification, which introduces social differentiation in a very limited way: in effect, it divides the population into two types of people with identical, intra-type characteristics (and goes no further than that). There are ‘Highs’, with a PI of 5, who make up 30% of the population, and there are ‘Lows’, with a PI of 0.8, who make up 70% of the population. The API (Ro) for the population as a whole is 2.06 (5*0.3 + 0.8*0.7). It is assumed that the Highs and Lows never have inter-type close contacts. How in this case does the infection go?

Not very far is the answer, if the virus only has a foothold among the Lows. It peters out almost immediately. Given a foothold among the Highs, however, it will progress very rapidly at first and then slow as the number of Highs who are still susceptible to the disease decreases (because increasing numbers have been infected).

To track infections in the latter case it is necessary to solve the differential equations for the High part of the population, but it is easy to see that infection of 80% of the Highs would be more than sufficient to bring their API down to API (Ro) = 1. The feedback loop from the reduced number of High Ss would also be in play, so the number of infections required for API = 1 (among Highs) can be expected to be lower than that. If it were, say, 80% of 80% = 64% of Highs, that would translate into an infection rate for the population as a whole of 19.2%.
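Both claims – that the contagion peters out among the Lows and that it runs to roughly 80% of the Highs before the effective reproduction number within that group falls to 1 – can be checked with a short sketch. The recovery rate gamma is an arbitrary assumption, the dynamics are deterministic, and inter-type contact is zero, as in the hypothetical:

```python
# Each type is a closed SIR system, because inter-type close contacts are zero.
# PI plays the role of the within-type basic reproduction number (beta/gamma).
def cum_infected_when_r_hits_one(pi, seed=1e-4, gamma=0.2, days=730, dt=0.1):
    beta = pi * gamma
    s, i = 1.0 - seed, seed
    for _ in range(int(days / dt)):
        if pi * s <= 1.0:
            return 1.0 - s   # cumulative infections once effective R is at or below 1
        s, i = s - beta * s * i * dt, i + (beta * s - gamma) * i * dt
    return 1.0 - s           # effective R never fell to 1 within the horizon

print(f"Foothold among Lows:  {cum_infected_when_r_hits_one(0.8):.2%} of Lows infected")
print(f"Foothold among Highs: {cum_infected_when_r_hits_one(5.0):.0%} of Highs infected")
```

On these assumptions the contagion among the Lows is over almost before it begins, while among the Highs the threshold is reached at about 80% of the group, i.e. 24% of the whole population.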

This illustrates an important point, which has major implications for policy trade-offs: the early ‘selection’ of high PI individuals clearly assists the bug in moving swiftly in its first stages, but it comes at a price (for the bug): the spread may peter out when infections reach only a modest fraction of the total population (like, say, 19.2%). The rapid early progress is bought at the price of not reaching an otherwise potentially large fraction of the population at a later stage.

The problem for the bug is that the ‘selection out’ of high PI transmitters is, at least with the kinds of numbers assumed in the illustrative hypothetical, such a powerful factor in reducing the API. In terms of whole population averages, when 10% of the whole population in the example has been infected, the API will have fallen from 2.06 to 1.56 as a result of this feedback pressure alone. At 20% infected, that becomes an API of 1.06.
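The arithmetic behind those numbers is easily checked. The formula below simply restates the accounting used in the text, in which a fraction f of the whole population has been infected, all of them Highs in the early stages, and infected Highs no longer transmit:

```python
# Whole-population API with a fraction f of the population already infected,
# all of them Highs ("selected out" as transmitters once infected).
def api(f):
    return 5.0 * (0.30 - f) + 0.8 * 0.70

for f in (0.0, 0.10, 0.20):
    print(f"{f:.0%} of population infected -> API = {api(f):.2f}")
# prints 2.06, 1.56 and 1.06 respectively, as in the text
```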

Setting the crude simplification aside, the more general point is that there is a ‘social-contact selection’ process at work: particularly in the contagion’s early stages, in a social setting the natural selection of the biology picks out high PI victims. But, as it does so, it has the immediate and direct effect of reducing the population API, which occurs with none of the slowness associated with the negative feedback pressures that build from the reduction in the number of susceptibles (S).

What I think is best called ‘community resistance’ – which is the combined effect of all the negative feedback loops in play (two in the illustrative example, more in practice) – will then be determined not only by the aggregate numbers of those infected and those still susceptible (the prime focus of the epidemiological work), but also by the social characteristics of the heterogeneous individuals who make up the numbers. Who are they? Where do they live? What are their living conditions? Where do they work? What do their social (physical contact) networks look like? And so on. Without some knowledge of those things, it is, I think, impossible to understand the contagion in any detail. We are left to play guessing games.

A second general point that follows from all this is a classic health warning, given by all good advisors to politicians contemplating a perturbation to a social system: beware of unintended effects/consequences. In the case of Covid-19, it is easy to see that the social distancing effects of a general lockdown are bound to have a negative direct (all other things being equal) effect on the rate of spread of the virus. At least in its less mitigated forms, however, it is an undiscriminating, one-size-fits-all policy and, as such, it hinders the ‘natural’ selection process that, at least arguably, has done most to inhibit the spread of the virus.

Whether such a policy will, in the end, actually reduce the cumulative deaths attributable to the contagion is, therefore, very much an open question at the time of writing:  it cannot safely be presumed.  If a bug’s eye view of things suggests that finding its way quickly to the infection of high transmitters is good for it in the short term, but not so good for it in the longer term, that intelligence might be useful information for humans contemplating how best to counter the harms it brings.

Finally, to repeat a sentiment expressed above, this short piece is directed at encouraging policymakers to look at the issues in a different light, a light that illuminates the social factors that are involved in the contagion. Man is a social animal embedded in a highly complex social ecology, not a herd animal, still less an atom of the kind found in the kinetic theory of gases, moving about randomly and bumping into others along the way. We should not lose sight of that.

Regulatory Policy Assessment in the Covid-19 era: a Once and Future Pathway?

Back in the 1990s and the early years of the 21st century the UK government developed a relatively sophisticated handbook to guide the evaluation of alternative lines of regulatory policy development and implementation, largely under the stewardship of the Cabinet Office and the Business Department. This was part of an international movement in which it is fair to say that the UK played a leading role. By way of example, the UK promoted the establishment of a ‘Directors of Better Regulation’ group for Member States of the EU, outside of normal EU structures, where comparative experiences could be discussed and know-how could be shared. Other innovative institutional developments of the time included the establishment of the Better Regulation Executive and the Better Regulation Commission, itself a successor to an advisory Better Regulation Taskforce dating from 1997.

Disruption and resistance

The progress of this ever-increasing knowledge base was disrupted in the later part of the first decade of the new century (I tend to date it as ‘around 2007’, that year being a fateful one across a range of public policy matters, both locally and globally). Examination of the causes of that disruption is a complex exercise, but, in my own bailiwick at the time (energy sector regulation) the proximate cause was a growing tension between the implications of thought-through policy assessments and an emergent conventional wisdom on climate change policy. Behind that, and more fundamentally, was what in the Covid-19 era might be called a very major co-morbidity: Leviathan’s very powerful ‘immune system’.

By that I mean the resistance of our system of government to any intruding ‘intellectual virus’ that might significantly alter its own cellular activity, irrespective of the wider consequences, for good or ill, of such alterations for the public. In the most powerful, imagined conception of Leviathan’s immune response, George Orwell gave it the name ‘Crimestop’ in 1984’s Newspeak: “… the faculty of stopping short, as though by instinct, at the threshold of any dangerous thought. It includes the power of not grasping analogies, of failing to perceive logical errors, of misunderstanding the simplest arguments if they are inimical to Ingsoc, and of being bored or repelled by any train of thought which is capable of leading in a heretical direction. Crimestop, in short, means protective stupidity.”

Back in reality, a much more utilised common-or-garden form of defence mechanism is what I have called convenient, selective myopia, roughly meaning “don’t look at things you don’t want to see, ignore them”, a form of wilful blindness.

The original intent of regulatory impact assessment (RIA) was to discover, structure and analyse information that was judged relevant to an upcoming decision. Importantly, the intention/purpose of the exercise was to better inform those who would take the decision. It was not intended to be a cost-benefit analysis in which benefits and costs are monetized and totalled, still less a procedure via which the decision itself should be determined on the basis of a net-benefit criterion.

There is good reason for that: only those to whom decisions are formally entrusted have legitimate authority to place values on the assessed effects of regulatory measures, involving, as they not infrequently do, quite difficult balancings of positive and negative impacts/consequences on/for different communities and interests.

From a wider public perspective, the notion of ‘informative’ regulatory impact assessment (RIA) may look like a friendly intellectual virus, conferring only good (better information for decisions), but that is not typically the view of decision makers themselves. The problem lies in the risk that the assessment process will make transparent things that they would prefer to remain undiscovered. Considered decisions are not necessarily, indeed arguably rarely are, the decisions that bureaucrats and politicians want to take, and that is in large part because decision makers tend to possess ‘partial’ (meaning partisan or ‘private’) agendas of their own, distinct and separate from a wider public interest agenda. Individual politicians almost invariably want to increase their own power, civil servants are partisans in the cause of the size and influence of their own departments or units, and so on. Leviathan’s senior servants can, therefore, be expected to be, to varying degrees, resistant to evaluation processes that might threaten these interests.

In this struggle the RIA virus has a weakness: it itself is a process that takes time, a virus that, to push the metaphor to its limits, mutates as new information is discovered and new thoughts occur before it reaches its final form. As used to be said of the core working document itself, it is a “living document”. This lapse in time gives, to those to whom its unfettered expression might be unwelcome, ample opportunities to insert themselves into the development process and re-engineer it. Indeed, over time, it becomes unnecessary to do even that. The assessors tend to come to recognise the likely negative responses to certain types of evidence and of certain lines of thinking and, to avoid inevitable later hassle and near inevitable defeat, self-re-engineer the process from an early stage onwards. That which was intended to inform decisions is flipped to become a process driven by the goal of justifying a decision soon to be made, without the intrusion of inconvenient facts and considerations.

The QuickScan

In this battle between impartial (non-partisan) assessment and the partial agendas of officialdom, one idea that came out of the EU Directors of Better Regulation Group, originating from the know-how of a Dutch regulator, was what researchers attached to the Regulatory Policy Institute (RPI) came to call a QuickScan. This was conceived as a first, short-duration exercise that would take a wide-angle look at the relevant issues, broadly asking “What things do we need to examine and explore in order to ensure that all relevant information is available for the making of upcoming policy choices?” Doing that at least gets a wide set of considerations placed on the table in the form of a first, speedily produced document that can later be used as a checking mechanism: if something identified in the QuickScan is later ignored or omitted, there should be substantiated reasons for so doing.

Such a process provides at least some degree of defence against disruption by Leviathan’s antibodies, and the wide vision/perspective of it is perhaps to be particularly stressed. It is interesting to note, for example, that criticisms of Government responses to the Covid-19 crisis have increasingly been based on a perception that relevant authorities have taken overly narrow views of the crisis and have become unduly fixated on a narrow range of issues, problems and questions. Moreover, exactly that same critique has been made of the conduct of banking and financial supervision ahead of the 2008 crash. It was, for example, a key part of the confessional letter sent to the Queen by a group of Fellows of the British Academy in response to a (rather good) question of Hers when visiting the LSE in the post-crash period: “Why had nobody noticed that the credit crunch was on its way?” And the critique of narrowness can, I think, be generalised.

Regulatory measures can be viewed as perturbations to a complex, adaptive, socio-economic system, i.e. as perturbations to an ecosystem. It cannot be generally assumed, therefore, that the effects of such perturbations will be limited to a narrowly confined part of the system, the parts of which are generally interconnected in one way or another. It is clearly an impossible task to identify all the consequences of a measure, but there is a requirement for a sensibility that calls for identification and consideration of at least the most salient effects, and for recognition that these might be neither localised nor intended by those ultimately responsible for decisions.

It is rare for a decision maker to want to see harmful consequences of a policy with which he/she is associated: it is much more common for them to want to claim positive consequences for their actions, whether or not they have any causal links with the perturbation. Therein lies an obvious problem. If negative consequences exist, the bias will be toward leaving them unexamined, for, if they are assessed, it will no longer be possible to claim later that they were unanticipated consequences and, as such, that less blame for them is merited.

This is what convenient, selective myopia (wilful blindness) looks like. The Competition Appeal Tribunal has referred to the approach as ‘pixelated’ – likening it to a propensity to focus on some blocks of pixels in a digital image and to ignore others – and that is a term that the RPI has also taken up in some of its work and thinking.

The concept of a QuickScan has some affinities with a 2006 proposal from the Better Regulation Commission (BRC) to establish a unit that it called the Fast Assessment of Regulatory Options (FARO) Panel, to examine calls for urgent government action in the event of some major health or safety risk such as an epidemic or a major rail crash. The BRC was, however, itself a victim of the disruption of the time: it was soon abolished and the proposal, like the QuickScan, was not taken up. The two approaches did, however, differ in at least two important ways.

First, the BRC sought the establishment of a distinct, new institution/unit, described as follows: “The Panel should be independent, politically neutral and external to government. It should provide timely advice to ministers on appropriate, cost-effective responses which have a real impact, having considered all aspects of the risks involved, trade-offs, priorities and policy alternatives.” In contrast, the QuickScan concept did not call for the establishment of a ‘sitting panel’ external to government, but rather for a capacity and willingness to assemble teams within departments or government agencies, including the sectoral regulators. Such teams would be determined on the basis of the requisite skills and knowledge, i.e. skills and knowledge that would be of value in the specific context of the relevant policy issue. Even within a department or agency specialised in a particular area of policy, such as communications, transport, energy or health, the wide variations in contexts almost necessarily imply some rotation of members from case to case, including specialists brought in from outside government.

Second, the FARO Panel was explicitly designed to respond to situations in which heavy pressures on politicians had led them already to conclude that, with high probability, ‘something must be done’. As its name implies, fast assessment of options was intended to focus only on options, on the ‘something’ that it would be best to do. The proposal sought a pause for thought between the first political conclusion and the eventual response. The hope was that this would help foreclose disadvantageous, knee-jerk policy responses. The focus is therefore narrowed at the outset (to options) and the evaluation process is necessarily ‘pixelated’.

The QuickScan, in contrast, is much less constrained. The whole purpose is to develop a wider field of vision so as to be able to better respond to the incoming results of the discovery process that is entailed by good regulatory impact assessment, of which it is the starting point. It is the ‘good start’ of what became the RPI’s unofficial motto, an Irish proverb, Tús maith leath na hoibre, a good start is half the work. Consistent with RIA guidelines, that always encompasses the option of doing nothing or, more accurately, doing nothing for now. More importantly, it starts with a detailed analysis of the problem or challenge to be faced and, in particular, of its context.

Verstehen, Verstehen, Verstehen

At this point I come to the most important of the concepts in play in policy evaluation, namely ‘understanding’ or ‘Verstehen’, the core notion in just about all the RPI’s own work. The first step is to understand the issues and their context, which in the latter case involves understanding the workings of the relevant parts of the ecosystem so as to be able to assess how their functioning might be affected by any potential policy/regulatory perturbations.

The concept of Verstehen comes to today’s social sciences chiefly via a German scholarly tradition, associated in particular with Max Weber, where it means an approach that examines a socio-economic question from the perspective of each of the potentially many actors who might be involved or affected, asking: how do/will they see things? In other words, it asks that the analyst be able to ‘put themselves in the shoes’ of those likely to be affected by policy measures, or by the lack of such measures, and see things from their point of view, taking account of their attitudes and their behaviours. It is therefore an activity that engages both empathy and imagination.

The approach is older than these 19th and 20th century developments, however. For example, in explaining his own work the great 17th century Dutch philosopher Baruch Spinoza said: “I have diligently tried not to laugh at human actions, nor to mourn them, nor to abhor them, but to understand them.” Perhaps surprisingly to many, particularly given that it is a foundational document in the history of political economy (and hence economics) in the English-speaking world, the most systematic exposition of the approach is to be found in Adam Smith’s Theory of Moral Sentiments (TMS). Throughout the work, Smith invites us to put ourselves in the shoes of others, both actual others and, central to moral and social judgments, a hypothetical ‘impartial spectator’, someone who does not bring any private/partial agendas or interests to their judgments.

A flavour of the reasoning can be gleaned from one of the most cited passages of the work in which Smith criticises the approach of what he called a ‘man of system’. Today this term might be used to characterise someone with a proclivity for central planning or with fixed ideas about how things should be done. Since the TMS receives little coverage in modern economics courses, it is, I think, worth quoting the passage in full.
“The man of system is nothing like that. He is apt to be sure of his own wisdom, and is often so in love with the supposed beauty of his own ideal plan of government that he can’t allow the slightest deviation from any part of it. He goes on to establish it completely and in detail, paying no attention to the great interests or the strong prejudices that may oppose it. He seems to imagine that he can arrange the members of a great society as easily as a hand arranges the pieces on a chess-board! He forgets that the chessmen’s only source of motion is what the hand impresses on them, whereas in the great chess-board of human society every single piece has its own private source of motion, quite different from anything that the legislature might choose to impress on it. If those two sources coincide and act in the same direction, the game of human society will go on easily and harmoniously, and is likely to be happy and successful. If they are opposite or different, the game will go on miserably and the society will be in the highest degree of disorder all the time.”

The importance of understanding is here in part highlighted by considering the consequences of its absence. The ‘man of system’ has fixed ideas to which he is attached. He pays no attention to the ‘great interests or strong prejudices’ of those who might be affected by his plans. Other social actors are assumed to be passive (a false assumption), like chess pieces, sitting waiting for the legislature (or, more likely today, the executive branch of government) to move them around. There is no understanding that each member of society (an assumed chess piece) has her/his ‘own private source of motion’. In consequence, “the game will go on miserably and the society will be in the highest degree of disorder all the time”.

It can be noted as a general point that Smith here is not calling for a laissez faire approach to public policy, something that he never did. Rather he is calling for an alignment of policy to the data, as that word might be used in philosophy (‘things known or assumed as facts, making the basis of reasoning or calculation’), where, critically, that data includes the views, beliefs, attitudes, intentions, heuristics and understandings of others who are caught up in the relevant situation (the ‘social facts’) and hence of their own likely conduct in the light of these things. The argument is that such alignment will lead to an outcome in which ‘the game of human society will go on easily and harmoniously, and is likely to be happy and successful’.

This is much closer to the ancient Chinese Daoist concept of wu wei (roughly ‘effortless effort’) than to laissez faire. Smith uses the word ‘natural’ to signify the functioning of the complex, adaptive socio-economic system without the perturbations of Leviathan’s hand, and he calls for that hand to be applied in ways that are complementary to and supportive of that functioning, ways that ‘go with the flow’, not ways that seek to substitute for it or stand in opposition to it. And experience teaches that, all too often, we see regulatory policy interventions that create negative feedback loops that resist the intended effects, because they go against the grain of the system to which they are applied.

Verstehen as it appears in modern social science, then, emphasises seeing things as others see them, but that is only part of the relevant data. Typically, what all social and economic agents are looking at, from different angles, is the same context, the same ecosystem, of which all are part. For the political economist, therefore, the understandings of the actors provide only part of the picture of interest, which is the functioning of the system as a whole. If the aim is, so far as possible, to align policy with something, it is a good idea to understand how that something functions.

Since the relevant context invariably features the functioning of a complex system, a commonality of these different aspects of understanding is that each requires that matters be examined from a variety of different perspectives, from different viewpoints. In political psychology the ability of any one individual to be able to do this is captured in the notion of a cognitive style measured by ‘integrative complexity’: ‘a research psychometric that refers to the degree to which thinking and reasoning involve the recognition and integration of multiple perspectives and possibilities and their interrelated contingencies. Integrative complexity is a measure of the intellectual style used by individuals or groups in processing information, problem-solving, and decision making.’

Elsewhere in the Theory of Moral Sentiments Smith explains why it would be impossible for any ‘man of system’ fully to comprehend the workings of any significant part of a socio-economic system: the information required is too vast. Hayek would later call a belief otherwise “the fatal conceit”. An individual naturally inclined to integrative complexity, such as Smith’s notional ‘wise sovereign’, can get a little bit of the way. A team of people, with diverse skills and experiences, can probably get as far as can be got. From these points two rules of thumb for assessment purposes might be inferred: (a) first do no harm, and (b) it takes a team – a team committed to achieving the best feasible understanding of things. And that’s a high-level intellectual exercise, not a routine bureaucratic task.

Thoughts on the novel coronavirus contagion

The above reflections were triggered by observing governmental responses to the arrival of Covid-19 in the UK. As in relation to Smith’s chess metaphor, it provides an example of being able to see the importance of Verstehen by reference to events that might occur in its absence. The context differs from Smith’s example, however, in that here a ‘man of system’ was not the source of the major problems: that term would not be an appropriate characterisation of the UK Prime Minister! Rather, the Government found itself in an immediate cloud of uncertainties and struggled for want of early but considered, wide-vision advice.

The Government did, naturally, turn to experts for advice and (also naturally) first to epidemiologists and experts in medicine and the functioning of the National Health Service (NHS). Such expertise was clearly required, but it by no means encompassed the full diversity of expertise and experience relevant to the assessments required. Notably absent were social scientists – I do not count the behavioural specialists here, since their focus is on behaviour modification, not on understanding – and also those with knowledge and know-how concerning the characteristics and functioning of networks.

The latter may seem to be unusual, suggested additions to a team, yet, putting oneself in the shoes of a hypothetically intelligent virus, the first thing the bug might have considered in its own strategy development was the networked structure of the ecology it was about to invade, with all its weaker and stronger lines of defence. That ecology is a socially and economically constructed network of connections between the multitudinous individual, potential hosts of the virus. It is not an animal herd, which has a much less structured and much less differentiated pattern of interactions between individual animals.

Just stating this obvious point is enough to demonstrate how easily thinking can go wrong at the very outset. The very language used can, almost instantly, lead to neglect of relevant socio-economic data, constituting a major failure of assessment. The vision is narrowed and a blind eye is turned to relevant evidence.

A QuickScan process would/should have had no fundamental difficulty in addressing the uncertainty problem: the uncertainty is a blindingly obvious feature of the context. Its existence is a relevant piece of data: we can see a fog from a stationary position, even if we can’t see through it, and seeing it will affect how we proceed.

As a quick, wide vision, first-take of a challenge ahead, the regulatory assessment is intended to be a way of mapping out areas in which the discovery of new information can be expected to be particularly valuable and a way of charting the first steps forward. A first map is necessarily devoid of great detail, but it can identify areas where further, important discoveries might be made. The difference between Covid-19 and other issues for which a QuickScan might be deployed is largely one of degree, arising from the particularly high salience that a Covid-19 assessment would attach to the acquisition of new knowledge (the discovery process) and to the speed at which it would need to be achieved.

Moreover, a widely drawn team would, like a hypothetically intelligent bug, surely have recognised that the phenomenon at hand was a network contagion, with at least some characteristics in common with, say, a banking collapse or a power failure in an electricity system. Understanding of the latter problem, which runs deep among the relevant experts (not least because of the regularity with which it has to be addressed) would likely have particularly helped in framing thinking. The notion of ‘circuit breakers’ is central, both to contain the scope of a particular failure and, if that cannot be done in time, to protect those parts of the system where most harm could be done, for example by isolating a very localised part of the network and allowing on-site generation of power to replace public, grid supplies.

In fact, policy development in the face of Covid-19 did get to ideas of first suppression, then delay – a stage not to be found in electricity systems because the speed of the contagion in that case makes the speed of Covid-19 look like the slowest of snails – then ‘shielding’ (the hospital with its own, on-site back-up generators). Where discovery (or re-discovery) has been less successful, however, concerns the regulatory principles of targeting and proportionality, which the early developments in RIA had so clearly established, but which appear forgotten now.

It is to be expected, for example, that a QuickScan would have quickly recognised that, if a ‘shielding’ strategy was potentially needed at a later stage, its success would require urgent, not-to-be-delayed measures to put in place the necessary back-up arrangements, e.g. for hospitals, care homes (with circa 500k inhabitants) and households that are co-habited by older (at high risk) and younger (at much lower risk) generations (they account for about 15% of all households in the UK, but a significantly higher percentage than that in some communities). Hospitals may have back-up electricity generators, but there has been no equivalent attempt to develop emergency operational requirements for shielding in the face of events like the Covid-19 contagion. Shielding arrangements therefore require de novo development and that would pose a very major challenge.

The benefits of a QuickScan team dedicated to the pursuit of integrative complexity are partly illustrated by the electricity example above, but they are of more general value in answering an early diagnostic question that should be asked when seeking to meet a new, complex challenge: have we seen a problem like this before? It is a question that should always come with a health warning: in answering, don’t over-privilege the previous experience. The purpose of the question is to provide initial lines of attack for thinking about a new problem, not to create ready-made options for tackling it. Experts with different experiences will be in a position to come up with different, analogous problems that they have seen before, each of them potentially, if only partially, informative. In this case there is one, immediately obvious answer: SARS. Looking at pre-Covid-19 summaries of the SARS experience, the similarities are evident, and both viruses are part of the same family. Having failed to use previous experience to prepare for the next pandemic, the next best thing might have been to have looked quickly at nations that had learned from previous experience and were more fully prepared for the appearance of the novel virus, the most notable being South Korea.

Instead of these things, and not alone in the world in this, the UK Government has come to rely upon general lockdown and social distancing measures, which take no account of targeting and proportionality principles. At the time of writing, exit from these measures appears highly problematic. There is, therefore, indeed a threat that “the game will go on miserably and the society will be in the highest degree of disorder”, if not for all of the time, then at least for some time to come.

How to win incremental votes in the forthcoming UK General Election

The forthcoming General Election (GE) will not be exclusively concerned with Brexit, but Brexit issues can be expected to loom large. These issues are widely viewed as having disrupted party loyalties and moved electoral politics into a wholly new context, and they have had a disorienting effect on many.

In this essay I want to develop the argument that (a) there is a serious mismatch between the Brexit policies being offered by UK political parties and public preferences/attitudes on relevant Brexit matters, (b) that part at least of the mismatch is attributable to the ‘strategic incompleteness’ of the policies on offer, which in turn appears to be linked to a continuing weakness in analysing the sequencing of policy choices, and (c), as a corollary, there are potential votes to be won by filling in some of the strategic gaps.

The policy stances

First, consider the major national parties’ policies as they appear to stand at the time of writing (noting that they have shown a tendency to move around somewhat):

• Conservative Party (CP): Pass the Withdrawal Bill; sign the Withdrawal Agreement; withdraw from the Treaty of Lisbon in January; negotiate & implement a fairly standard FTA with the EU by the end of the transition period (currently set at 31/12/2020).

• Labour Party (LP): Renegotiate the Withdrawal Agreement and/or the Political Declaration to point them to a ‘softer’ Brexit that currently lacks any clear specification, but comes with a strong indication that it should include a CU; then put the provisional agreement to a confirmatory referendum (likely requiring a further extension of the A50 period of around 6 months or more); indicate that the LP, or at least most of its MPs, might vote against the new agreement and in favour of Remain.

• Liberal Democrats (LDs): Revoke the Article 50 notification and remain a member of the EU.

• Brexit Party (BXP): Put the Withdrawal Agreement in the trash can; negotiate a further extension of the A50 period to 30 June 2020; negotiate and implement an FTA with the EU by 1 July 2020; if that proves impossible, simply leave the EU without any overarching trade agreement in place (the No Deal outcome).

Even recognising the necessity of presenting policies in simplified forms, the public is ill served by these offerings. They are saturated with fantasies (about what could be realistically achieved and when), fail to specify policy relating to important elements of the form of separation, and are suffused with vagueness. By way of examples:

• The notion that a new FTA could be negotiated and implemented by end 2020 (CP) or, a fortiori, by end June 2020 and in the absence of a Withdrawal Agreement (BXP), is not credible. It is simply wishful thinking.

• No LD policy is specified for the realistically possible circumstances in which the Government succeeds in achieving withdrawal from the EU by end January 2020 (it’s one of the few aspirations in the list which is realistically attainable), yet those circumstances could eventuate within a few weeks of the General Election. That would render ‘Revoke’ meaningless, leaving the LDs with no policy at all.

• The LP’s notion of the ‘soft’ Brexit it wants to negotiate is ill defined, its feasibility is unexamined/unexplored, and its own position in any subsequent referendum is left vague. It is, in effect, a policy not to have a policy other than rejection of the No Deal possibility, an extended exercise in can kicking.

Part of the general problem is that, when discussing Brexit options, there has been a constant tendency to (i) conflate withdrawal issues and future relationship issues (despite equally constant warnings from experts in the wings that the two should be carefully distinguished), and (ii) conflate the transition period defined by the WA with what I have elsewhere called the ‘interim period’ stretching from Brexit day to the implementation of any new future relationship agreement, see https://gypoliticaleconomy.blog/2018/09/15/brexit-sequencing-and-the-interim-period-problem-limbo-in-our-time/ .

For example, there is constant reference to the WA as ‘the Deal’, when much the more important policy issues are connected with the future relationship agreement (or absence thereof).

Public attitudes

This dog’s breakfast of a menu put before voters hinders the tasks of (a) informing the public about the relevant issues and trade-offs and (b) inferring public preferences from polling results that, all too frequently, pose chalk and cheese alternatives.

In the general confusion, I continue to rate the attitudinal studies of researchers at King’s College London as the benchmark for sound analysis. They are focused unambiguously on future relationship matters which, in shorthand form, are encompassed by the general question: how close a future relationship do you want to see between the UK and the EU?
(A summary is to be found here, and it is well worth a read in the current context:
https://ukandeu.ac.uk/we-asked-the-british-public-what-kind-of-brexit-they-want-and-the-norway-model-is-the-clear-winner/ )

The potential answers to that question are obviously not binary, i.e. very close or remote: the degree of ‘closeness’ is not to be measured by a single, 0-1, bit of information. The safest inference from the referendum result is simply that a majority of those voting – a majority whose size is almost certainly underestimated by the actual Leave vote, because significant numbers of Remain votes will, rationally, have reflected an explicit or implicit assessment that change would just not be worth the hassle – were of the view that a relationship less close than that defined by the Treaty of Lisbon would be preferable.

The KCL study delves deeper into this issue, examining public attitudes on the various, detailed trade-offs that are involved. It does so by working within a conceptual framework that has been a standard part of economics teaching and research for many decades. By way of example, when assessing the likely value of a prospective, complex product that might be put on the market (and policy offerings can be viewed as political ‘products’), the approach seeks to define the main, component characteristics of the product (for a car: engine size and type, fuel efficiency, diesel/petrol, seating capacity, …) and to set about discovering evidence on the values placed on the individual characteristics by consumers. The results can then be used to assess whether the new combination of characteristics in contemplation would be a winner in the market.
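A toy sketch of that logic might look like the following, with entirely hypothetical characteristics, levels and weights – the KCL study’s actual instrument and data are, of course, more sophisticated:

```python
# Allocate a respondent to the bundle of characteristics they value most.
# The characteristics, the levels each outcome delivers (0..1), and the
# example weights below are all invented for illustration.
BUNDLES = {
    "Remain":  {"own_trade_policy": 0.0, "migration_control": 0.0, "market_access": 1.0},
    "EEA":     {"own_trade_policy": 1.0, "migration_control": 0.2, "market_access": 0.9},
    "CU":      {"own_trade_policy": 0.0, "migration_control": 1.0, "market_access": 0.6},
    "No Deal": {"own_trade_policy": 1.0, "migration_control": 1.0, "market_access": 0.2},
}

def best_bundle(weights):
    """Return the outcome whose combined characteristics score highest
    under one respondent's weights on the individual characteristics."""
    return max(BUNDLES, key=lambda name: sum(
        weights[c] * level for c, level in BUNDLES[name].items()))

# A respondent who values an independent trade policy and market access:
print(best_bundle({"own_trade_policy": 0.8,
                   "migration_control": 0.1,
                   "market_access": 0.7}))   # -> EEA, for these invented weights
```

The same machinery makes it straightforward to score a new combination – a ‘Norway+’ bundle, say – against unchanged respondent weights, which is essentially the exercise reported further below.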

Thus, instead of asking the public about their views on, say, a CU – a concept about whose entailments people are likely, like Mr Clarke, to have highly limited knowledge – the main characteristics of a CU are first identified (e.g. ‘how important do you rate the ability of the UK to negotiate its own FTAs’) and respondents are asked about attitudes to them, without mentioning the concept of a CU itself. One major advantage of this in current circumstances is that the meaning of terms like Customs Union and Single Market have become heavily polluted with political associations that influence responses. For example, ‘Mr Mogg dislikes CUs, Mr Clarke likes them, and in general my views are much more closely aligned with Mr Clarke’. (Here, Mr Clarke would be afforded ‘epistemic authority’, even though he is close to clueless on the economic issues at stake and took the CU idea on board for reasons of political expediency).

The kinds of characteristics examined in the KCL study encompassed areas like freedom of movement rights, ability to conduct an independent commercial policy, budget contributions, regulatory influence, etc. Intensities of preference of respondents were then aggregated into four bundles that matched four future relationship outcomes: Remain, EEA (Norway), a CU, and No Deal (meaning no future relationship agreement of any significant depth). Each respondent was then allocated to the bundle/label for which that individual’s valuation of the combined characteristics was highest. Results were as follows.

[Figure omitted: allocation of survey respondents across the four bundles (Remain, EEA/Norway, CU, No Deal) in the two survey years, from Rohr et al.]

Two points to note immediately are:

• Although there is a substantial body of support for the characteristics of the two end-of-spectrum options (Remain and No Deal), over half the sample had preferences more closely matched with one of the other two options (EEA/Norway and a CU).

• A significant volume of ‘switching’ was recorded in the two years covered by the research, although the totals in the two years are not much different. An immediate inference is that there is a substantial number of voters who could easily be tipped into another category, i.e. they are not deeply attached to just one option. The thickness of the bands running from Remain to EEA, from EEA to Remain, and from No Deal to EEA is striking in this regard. Public attitudes appear much less polarised between outcomes than parliamentary, party activist and media attitudes.

This last point is underpinned by more familiar polling results that indicate that the EEA is viewed by a substantial (not a marginal) majority of respondents as an ‘acceptable’ way forward. Moreover, the results come with an obvious intuition. The underlying trade-offs involved are many and public attitudes to each can be expected to be differentiated. Aggregating over these trade-offs can be expected to lead to a spectrum of valuations of ‘closeness’, not a simple binary division. The latter (the binary division) comes not from public attitudes, but rather from the nature of the referendum question: should the UK remain in or leave the EU?

Given the underlying data on more disaggregated public preferences, it is possible to construct and evaluate public attitudes to other broad policy positions, such as EEA+ (aka Norway+ or CM2.0), which is a combination of EEA/Norway + a Customs Union (CU).  The researchers report that they have done this and the results indicated by that exercise are a good demonstration of the value of the approach adopted. Prima facie it might be expected that Norway+ would be closest to the preferences of a larger number of people than unadorned Norway, but the reverse is true. The proportion of the sample for whom the EEA + CU would be the nearest approximation to their preferences falls significantly from the 42% shown for the EEA in 2018.

The reason is that many respondents placed a significant value on the prospect of the UK once again having an independent commercial policy, and EEA + CU, unlike EEA alone, would not offer that. These people would, if Norway+ were the only middling option on offer, tend to switch chiefly to the No Deal category, which, in a three-option categorisation, would be the only one offering an independent commercial policy.

That may come as a surprise to some politicians and commentators who read the plus as signifying something that would add value. It is not at all a surprise after a moment’s pause for thought. It implies the delegation, by a major (and ex hypothesi independent) trading nation, of its international trading policy to politicians and bureaucrats of another jurisdiction, and that is not at all a normal occurrence.

The bottom line is that a CU option is a vote loser, which is likely one of the reasons why some of those who have thought about the matter in Paris and Berlin tend to the view that it is unsustainable as a longer term future relationship between the EU and a non-EU UK (another is experience with the Turkish arrangements). Would France and Germany delegate their international commercial policies to a foreign power, with only marginal influence on the conduct of such policies? No, they would not.

Matching policy stances and public attitudes

If the attitudinal results are mapped into the binary question of Remain/Leave, with the additional assumption that the CU category would tilt to the Remain side of the binary, the implication is that, other things equal, the public splits around 65/35 in thinking that the EU is ‘too close’ a relationship for the UK, i.e. that the UK is better out. That, I think, is in line with the wider evidence. The UK is, after all, opted out from the EU’s major project, monetary and fiscal union (and the separating effects of that are increasing over time), there is little joy at the prospect of an EU army, and a positive case for EU membership has been noticeably lacking over the past three plus years of Brexit discourse (it has all been very defensive – ‘hold on to nurse for fear of something worse’).

The elevation of the CU issue has been a product of an inward-looking, factional domestic politics, accompanied by large dollops of wilful ignorance and bad faith, but that is only one example of how domestic parties have found themselves wandering in blind alleys and dead ends, remote from where the public would like to see the country. The prospect of a GE, however, allows scope for breaking out of the deadlock.

Given the size of the ‘unserved’ or ‘unrepresented’ (by any major political party) middling view – roughly that the EU is too close a relationship, but considerations of history and geography indicate that the UK should still seek a relationship that is significantly closer than that which it has with most nations on the planet — it would be normal in a democratic system for parties that are more polarised to reach out into that middle in the search for votes. In the business analogue noted above, a company that spotted a large, unserved demand for a product with a combination of characteristics that it could feasibly offer, would likely be leaping at the opportunity to acquire new customers.

What we observe instead is the BXP and LDs going full throttle for the extremes of the ‘closeness/distance spectrum’, the CP being pulled toward the No Deal end of the spectrum in its competition for votes with the BXP, and the LP wandering in space, but drifting slowly toward the Remain end. Given this, the nervousness that is palpable on all sides is understandable. Each strategy is vulnerable to a pivot toward the middle by a competitor (in the business analogue, a rival firm could get to the new product first).

That pivot is most easily achievable by the LDs, because the current LD strategy suffers from an obvious and very major ‘incompleteness’: it contains no statement as to how it would position itself in the (realistically possible, perhaps even likely) event that, in less than three months’ time, the UK will have left the EU. All it has to say is this: ‘Whilst Revoke is our first priority, if that should fail, our policy will be to #rEEAmain, i.e. we will fight to keep the UK in the EEA.’ That exploits the fact that the Treaty of Porto (the EEA Agreement) is a separate treaty from Lisbon and, whilst the WA would see the UK out of the latter, it does not provide for exit from the former. It is the dog that has never barked.

It would also be not that difficult an adjustment for the LP, which has expressed the view that it would like its ill-defined ‘soft Brexit’ outcome to be as close as possible to Single Market arrangements. That could also be crystallized in #rEEAmain, although internal opposition from its now dominant Remain tendency might make that pivot less easy than it potentially is for the LDs, and it appears to be impaired by misreadings of EEAA provisions on state aid, competition policy and freedom of movement on the part of its Marxist ideologues.

These two possibilities make things tricky for the CP. It cannot easily reach out to the unserved middle without risking substantial leakage of votes to the BXP and a revival of its ERG faction. The vanilla FTA in the Political Declaration is there to prevent those things happening: it points to ‘No Deal’ as those words are to be understood in the context of the future relationship, i.e. that there will be no ‘special closeness’ to the EU.

The CP’s one great strategic advantage, on which it almost inevitably has to focus, is that it is the only party in a position to respond to a more immediate, widely shared public desire to get Brexit (in the sense of withdrawal from the Treaty of Lisbon) ‘done and dusted’ as soon as possible. That advantage has just been reinforced by the BXP’s repositioning to a policy that calls for Brexit to be delayed, yet again, until 1 July 2020 (which looks like a mis-step, but opinion polls will soon confirm whether or not that is the case).

The CP’s weaknesses are that (a) the vanilla FTA prospect is publicly unpopular and (b) getting it done by end 2020 is a unicorn: it can be expected to take substantially longer than that. If it could achieve the victory it seeks on 12 December, it could, like the LDs, easily pivot toward the centre of gravity of public attitudes (and, in their heart of hearts, ERG members know this, but will likely maintain their tradition of subordinating realities to wishful thinking in the interim). But it is a big ‘if’, possibly conditional on whether the LDs and/or the LP seize the opportunities opened up to them by the existence of a hitherto unserved middle.

Those parties have not done so thus far, but there is nothing like a GE (when politicians do tend to need to give more consideration to voters’ opinions than is the norm) for concentrating minds. With contingency planning in mind, Number 10 might advisedly have a chat with one of the Government’s own Ministers, George Eustice, who has already thought these things through. (Hint: the EEA is a far better place to be, on almost all counts, than in a protracted transition (Limbo), and that is something that can be said early on, without abandoning the first priority of an eventual, new FTA.)

Brexit: the ‘Sunderland Option’

Below are to be found:

(a) the introduction to the paper “Brexit and the Single Market”, which was first published in July 2016. https://www.researchgate.net/publication/305721352_Brexit_and_the_Single_Market/link/579bc67808ae6a2882f1aa08/download 

(b) a section of the paper’s text located under the heading ‘Response speeds and asymmetries of power’, included now (28 July 2019) because of its potentially very high salience in current circumstances (on first publication it was scarcely noticed), followed by a few comments on the reasons for its suddenly elevated salience.

In my mind I think of it as the ‘Sunderland Option’ because it was conceived in the minutes following the Sunderland referendum declaration at around 00.20 on 24 June 2016.  The sentiment was: “OK, that’s a definitive answer from my home city and the task now is to figure out how to satisfy these aspirations in the best way possible.”

Looking back now (July 2019), the one big thing I would change in the text would be to make a very firm distinction between what are, in fact, two variants of the ‘Single Market’ that have subsequently been conflated: the EEA and the EU Internal Market. Speaking roughly, the former is aligned with Mrs Thatcher’s vision of the Single Market and the latter with Mr Delors’s vision. The UK will exit the latter automatically upon withdrawing from the Treaty of Lisbon, but will not automatically exit the former. The act of Brexit will therefore automatically separate the wheat from the chaff. The ‘Sunderland Option’ in effect says “hang on to the wheat: it will be of considerable value in seeing us through the next period”.

 

Preface to “Brexit and the Single Market”

Summary of main points

The UK is currently a Contracting Party to the European Economic Area (EEA) Agreement, and exit from the EU does not necessarily imply exit from the Single Market (i.e. withdrawal from the Agreement). Exit from the EEA would require that extra steps be taken, either unilaterally by the UK or by the other Contracting Parties to the Agreement.

There is no explicit provision in the Agreement for the UK to cease to be a Contracting Party other than by unilateral, voluntary withdrawal, which requires simply the giving of twelve months’ notice in writing (Article 127). A commonly held assumption that only EU and EFTA members can be Parties to the EEA Agreement – and hence that the UK has to be a member of one or other of these two organisations to be in the Single Market – is not well grounded, although UK consideration of an application for EFTA membership is an option well worth exploring in its own right.

In the absence of a prior withdrawal notice or of steps by other Contracting Parties to try to force UK exit (of a nature not yet identified and not necessarily feasible in the light of the Vienna Conventions on the Law of Treaties and on Succession of States in respect of Treaties), on Day 1 of the post-Brexit era the default position appears to be that UK would still be a Party to the EEA Agreement.

This has major implications for any future negotiations. For example, with continuing UK participation there would be no requirement for an application for “access” to the Single Market.

Should the UK choose not to withdraw from the EEA, there would be a need for some textual adjustments to the Agreement, if only to reflect the UK’s changed status as a non-EU Contracting Party. The more substantive implications of continuing participation concern the operation of the institutions supporting the non-EU Contracting Parties – Iceland, Liechtenstein and Norway – not the EU institutions. Early discussions with the governments of these three countries are indicated: they need not await Article 50 Notification.

Continued participation in the EEA following Brexit would see substantial repatriation of powers covering the areas of agriculture, fisheries, trade policy, foreign and security policy, justice and home affairs, taxation, and immigration, consistent with the strong desire of many Leave voters to ‘take back control’. It would, for example, give the UK freedom to negotiate its own trade deals and set its own tariffs, as well as dispensing with the egregiously protectionist common agricultural policy.

Immigration is the most vexed issue, not least because of the difficulties in establishing a reasoned discourse on relevant matters. The underlying problem concerns the interpretation and application of the principle of free movement of persons, in respect of which EU political leaderships tend to favour a rather fundamentalist, ‘non-negotiable’ position, motivated by the goal of political union.

The EEA Agreement does not treat the ‘four freedoms’ (of goods, persons, services and capital) as absolutes. In each case it provides for limitations to be imposed when justified by some other aspect of public policy. Importantly, for non-EU Contracting Parties to the EEA Agreement the ‘decision maker of first instance’ is the relevant State, not the European Commission.

The commercial aim of the EEA Agreement affects the interpretation and application of the free movement principle, because that aim weighs heavily in determining what measures can or can’t be justified. A ‘political’ interpretation and application of free movement of persons is not well adapted to the aim of the EEA Agreement set out in Article 1(1). Given this misalignment, free movement of persons is almost inevitably a highly contested issue.

Nevertheless, the Agreement provides scope for unilateral action on free movement of persons that is not currently possible for the UK as a member state of the EU. Post-Brexit the Agreement would allow scope for at least some degree of re-alignment of interpretation and application of the free movement principle to better fit with commercial policy objectives.

In relation to budgetary payments by the UK, the default position appears to be a zero contribution from Day 1 of the post-Brexit era, if the UK opts not to withdraw from the EEA Agreement. This again affects the negotiating position. If the UK subsequently agrees to make financial contributions it should expect a quid pro quo, for example increased influence in the rule-making process for the Single Market and/or more explicit recognition of greater flexibility in the interpretation and application of the free movement of persons principle (whilst still pledging allegiance to the principle itself). Such developments would also be of benefit to the other non-EU Contracting Parties, and arguably to EU Contracting Parties as well.

Crucially, the EEA Agreement does not foreclose future policy developments of the types suggested by those who favour immediate exit from the Single Market: it simply leaves those other options available for future consideration and possible adoption, allowing ‘market testing’ of new policy approaches in the interim. On this basis continued participation in the EEA Agreement can be said to be sufficient unto the day.

Such optionality coupled with the faster response speed of the UK governance system amounts to an asymmetric competitive advantage over the EU in policymaking, which serves to counteract the asymmetric disadvantage of smaller market size. The overall imbalance in power in Single Market rule-making is therefore somewhat less than it might appear at first sight.

The aim of the EEA Agreement (set out in Article 1(1)) is highly consistent with the longstanding aims of UK commercial policy, steady and dogged pursuit of which could be a stabilising factor that, inter alia, serves to reduce political uncertainty in markets.

 

From the section headed ‘Response speeds and asymmetries of power’ 

…  A second identifiable factor is the relative response speeds of different Contracting Parties, i.e. the speed with which each can adjust its own policies in reaction to unwanted conduct by others. A fast response speed is a distinct advantage to whoever can command it. If the possessor is a dominant economic agent, it tends to reinforce the asymmetry of power and provide greater incentives for the use of that power; if the possessor is a weaker economic agent, it tends to mitigate the asymmetry of power and give rise to weaker incentives to use that power in the first place.

The UK’s governance arrangements can, when needed, be remarkably speedy by international standards: the system has what might be called a low inertia mode that can be switched on when circumstances dictate. In contrast, EU rule-making is a relatively cumbersome and slow process, which is unsurprising and difficult to avoid in a structure with so many members with differing interests. …

Comments

Game theory provides a conceptual framework for strategic thinking in the military arena as well as in economics, and indeed one of the key readings for my own postgraduate lectures in the 1980s used to be Thomas Schelling’s ‘The Strategy of Conflict’. That highly readable book (with a central theme deriving from Odysseus and the Sirens) was concerned mostly with US cold war strategy, but the shared conceptual framework is indicated by the fact that Schelling was subsequently awarded the Nobel Prize in Economics.

In the military field the importance of differential response speeds in determining outcomes has been particularly emphasised by the US Air Force strategist John Boyd, who argued that a key strategic advantage flows from being able to create situations in which it is possible to make appropriate decisions more quickly than opponents. 

‘Brexit and the Single Market’ pointed out that such a situation was already established at the outset of the Brexit negotiations: it didn’t need to be created first by other strategic manoeuvring, and that fact allows for a great simplification. To be turned to advantage, though, it would be necessary first to ‘flick the low inertia switch’ (since in most circumstances UK governance sits in a high inertia mode). The tragedy of the last three years is that that was never done.

There are two immediate reasons why these points are, quite suddenly, highly salient:

  1. With the arrival of the new Government the switch has been flicked (arguably because it became necessary for the survival of the Conservative Party in anything like its current form).
  2. Dominic Cummings is deeply familiar with the work of John Boyd.

 

What does the UK want?

The governance system of the UK has suffered Brexit paralysis for 30 months in the face of the parent-to-child question “What do you want?” It has been asked many times over in the capitals of Europe, without any great success in eliciting a clear response. It is time to end the stasis by testing out the preferences of MPs.

With a little bit of help from basic social science the initial exercise could be simple and very quick.  The aim is limited — to discover more about preference orderings — not to invent a voting mechanism that will be determinative in relation to major decisions.  It is the classic aim of regulatory impact assessment: to inform decisions, and to do no more than that.

When an issue is binary (Remain/Leave), a preference can be revealed by asking the subject to choose between the two options. When there are multiple options, preferences are discovered by asking for a multiplicity of binary choices to be made. A more comprehensive preference ordering is then built up from these binary comparisons.

That’s hard to do if there are myriad options available. But suppose there are only four, broad-brush options that require immediate assessment: A, B, C and D. There are then only six binary rankings/choices required: A vs B, A vs C, A vs D, B vs C, B vs D and C vs D. That entails entering a cross in six of twelve paired boxes.

Such a questionnaire could fit on a sheet of A4, with plenty of white space, and is a lot less complex than a Californian ballot paper. The full dataset for the UK House of Commons can be coded in a 650 (approx.) x 6 spreadsheet, most cells having 0 or 1 entries. That is a small-scale data exercise.
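
To see how small an exercise this is, here is a minimal sketch in Python. The option labels, the six pairings and the 0/1 coding follow the description above; the sample return and the simple win-count method of building an ordering from the binary comparisons are illustrative assumptions of mine, not a proposed implementation.

    # A minimal sketch of the pairwise questionnaire and its coding.
    # Labels and 0/1 coding follow the text; the sample return and
    # the win-count ordering are illustrative assumptions.
    from itertools import combinations

    options = ["A", "B", "C", "D"]  # the four broad-brush options

    # The six binary choices: A vs B, A vs C, A vs D, B vs C, B vs D, C vs D
    pairs = list(combinations(options, 2))
    assert len(pairs) == 6

    # One hypothetical return: for each pair, 1 if the first-listed option
    # is preferred, 0 if the second is -- six 0/1 cells per respondent.
    example_return = {("A", "B"): 1, ("A", "C"): 1, ("A", "D"): 1,
                      ("B", "C"): 0, ("B", "D"): 1, ("C", "D"): 1}

    # The full House dataset: roughly 650 such rows, one per MP.
    dataset = [example_return]

    def ordering(ret):
        """Build a preference ordering by counting pairwise 'wins'."""
        wins = {o: 0 for o in options}
        for (x, y), first_preferred in ret.items():
            wins[x if first_preferred else y] += 1
        return sorted(options, key=wins.get, reverse=True)

    print(ordering(example_return))  # ['A', 'C', 'B', 'D']

A win-count is only one of several ways to turn six binary choices into an ordering, but any such method would serve the limited aim here: to inform, not to determine.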

Some of the returns may, of course, violate the axioms of rational choice theory (e.g. by showing ‘cyclicity’ or ‘non-transitivity’), but most human decision-making does that anyway (don’t panic: it just means that the theory fails and that should disturb no-one other than those invested in it). The data will be what they will be.
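
Cyclicity, for what it is worth, is mechanically easy to detect in such a dataset. The sketch below (again hypothetical, reusing the coding from the previous sketch) flags a return as cyclic if some triple of options runs x over y over z over x; with all six pairwise choices made, any preference cycle implies such a triple.

    # A minimal sketch of a cyclicity check on a single return
    # (hypothetical, reusing the coding from the sketch above).
    from itertools import permutations

    OPTIONS = ("A", "B", "C", "D")

    def beats(ret, x, y):
        """True if x was preferred to y in this return."""
        return ret[(x, y)] == 1 if (x, y) in ret else ret[(y, x)] == 0

    def has_cycle(ret):
        # With a complete set of pairwise choices, any preference cycle
        # implies a three-option cycle x > y > z > x, so checking
        # ordered triples suffices.
        return any(beats(ret, x, y) and beats(ret, y, z) and beats(ret, z, x)
                   for x, y, z in permutations(OPTIONS, 3))

    cyclic_return = {("A", "B"): 1, ("B", "C"): 1, ("A", "C"): 0,
                     ("A", "D"): 1, ("B", "D"): 1, ("C", "D"): 1}
    print(has_cycle(cyclic_return))  # True: A beats B, B beats C, C beats A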

The four broad-brush Brexit options I would suggest are: (A) Remain in the EU, (B) ‘Norway’ (Leave the EU, but remain in the EEA), (C) a Stand-alone WA (‘stand-alone’ because it is the only framework Agreement sought), and (D) No Framework Agreement (‘No Deal’, but allowing for the possibility of specific agreements).

The stand-alone qualification is important because, for example, ‘Norway’ can be combined with a WA that covers matters not encompassed or not adequately addressed by the EEA Agreement itself, the most obvious of which are trade in agri-foods, trade in fish, and customs arrangements. Aspects of one approach can potentially be used to support or complement another, broad-brush approach.

The individual polling is not, however, a stand-alone exercise. The intent would be to form a view of opinions in the House of Commons as a whole. Knowledge of this intent may tempt respondents to play games with their individual responses.

MP X may prefer A to B to C, but, if B is thought to be the closest ‘competitor’ to A, he/she may be tempted to rank C above B in the binary choice between those two (less preferred) options, in hope of increasing the prospects for A at the collective level. We are all well familiar with this type of game playing from observing the Brexit process so far.

One defence against this is to make the dataset publicly available so that constituents, spreadsheet nuts, researchers, journalists et al can interrogate first the data and then the individual MP. In a representative democracy, MPs owe us their judgments for sure (Burke), but they also owe us some level of explanation for those judgments.

The prize in all this is an improved first mapping of individual and collective preferences, deliverable very quickly. There is a vote in the Commons next Tuesday. If the WA is voted down, an exercise like this could potentially be completed by the end of the week. If Government and Parliament won’t do it, a polling organisation or think tank could.

Will it work and be helpful? We don’t know. It is definitional that the outcomes of discovery processes are unknown: they don’t come with guarantees. We are already in ‘uncharted territory’, but, trained as a geographer, Mrs May should know that it might be a Good Thing to start charting it.

Conservative philosophy, then and now.

The Xmas and New Year break is a good time to catch up on background reading that has sat around on a ‘to do’ list for longer than it perhaps should have done. I’m currently about half way through a re-acquaintance, after several decades, with Edmund Burke’s Reflections on the Revolution in France, motivated by a passing thought that the current Parliament has seemed to exhibit some distinctly Jacobin tendencies.  There are many striking passages in what can reasonably be described as Burke’s long, sustained rant against the political developments in Paris toward the end of the 18th century, but the following passage had particular resonance for someone with strong interests in public policy.

“The science of constructing a commonwealth, or renovating it, or reforming it, is, like every other experimental science, not to be taught à priori. Nor is it a short experience that can instruct us in that practical science: because the real effects of moral causes are not always immediate; but that which in the first instance is prejudicial may be excellent in its remoter operation; and its excellence may arise even from the ill effects it produces in the beginning. The reverse also happens: and very plausible schemes, with very pleasing commencements, have often shameful and lamentable conclusions. In states there are often some obscure and almost latent causes, things which appear at first view of little moment, on which a very great part of its prosperity or adversity may most essentially depend. The science of government being therefore so practical in itself, and intended for such practical purposes, a matter which requires experience, and even more experience than any person can gain in his whole life, however sagacious and observing he may be, it is with infinite caution that any man ought to venture upon pulling down an edifice, which has answered in any tolerable degree for ages the common purposes of society, or on building it up again, without having models and patterns of approved utility before his eyes.”

The contrast with the sentiments of the current government is striking. The referendum result gave an immediate mandate for extensive institutional demolition: over 70% of the EU acquis was to go, including not just legislation focused chiefly on political integration, but also legislation in areas with high economic salience such as the EU customs union and the common agricultural, fisheries and commercial policies. EU legal supremacy was to be ended, and in future Britons would be governed by laws and regulations made exclusively by their home governments.

An immediate question for the Conservative Government was whether there should be a more comprehensive ‘pulling down’ than the extensive institutional demolition entailed by the referendum result itself. Burke’s ‘infinite’ caution was no doubt a rhetorical exaggeration, but ‘proceed with considerable caution’ might reasonably have been expected to have entered the minds of traditional conservatives. In the event, at the very outset of the Brexit process, a decision was taken to seek to pull down, in its entirety and not just in part, an edifice that has served a shared purpose to a ‘tolerable degree’ for some time. It is the structure of European trading rules, including, but going well beyond, the tariff rules to be found in less deep Free Trade Agreements. This institutional structure has functioned to reduce intra-European tariff and non-tariff trade barriers, covering something approaching 50% of the UK’s goods trade. It is commonly referred to as the Single Market [1].

This decision appears to have been taken by only three people (May, Hill and Timothy), exercising no caution whatsoever.  There was no thought-through assessment prior to the decision, and no realistically attainable ‘patterns of approved utility’ capable of ‘building it up again’ were considered in any depth.   Wishful thinking appears to have been judged sufficient:  “We know how to do this” is reported to have been the sentiment of the moment [2].  If so, that was Jacobin hubris, not Burkean prudence.

There is an interesting contrast here with that most economically radical of twentieth century Conservative Prime Ministers, Margaret Thatcher. The process of liberalization, privatization and regulatory reform associated with her name proceeded over the full 11+ years of her tenure of Downing Street, on a step-by-step basis, starting with easier policy exercises at its beginning and moving on to more difficult areas (like electricity and water) at its end (and even then not touching the yet more problematic cases of railways and postal services).  That allowed for sequential experimentation and  learning along the way.  At each stage an extensively considered new institutional structure was developed ahead of the abandonment of the old. There was no early equivalent of “we know how to do this”.  There was certainly boldness and innovation, but sagacity and prudence had not gone AWOL.

Despite the best efforts of the new Jacobins, the institutional edifice of the EEA Single Market  has not yet been destroyed.  The EEA acquis was established by means of an international Treaty, the EEA Agreement, which the UK made a solemn promise to observe when it first signed the Agreement (in Porto on 2 May 1992) and then ratified “according to [its own] constitutional requirements”.  As the Attorney General recently advised Cabinet colleagues in the context of the Ireland / Northern Ireland Protocol to the Withdrawal Agreement (another international Treaty), under international law such commitments do not simply melt away.  Concrete actions are required to end them.

It may be that the Government is now hoping to evade its EEA Treaty obligations by means of the Agreement just struck with Iceland, Liechtenstein and Norway concerning arrangements designed to follow a future, currently hypothetical, UK withdrawal from the EEA. The draft Agreement was published surreptitiously on 20 December 2018, possibly with the intent that, at a future moment of choice, it will serve as an agreement that supersedes the EEA, rendering parliamentary consent to EEA withdrawal otiose.  However, the new Agreement must also be ratified (see its Article 71(1)) and, if Parliamentarians smell a rat, they can forestall any such subterfuge.  Parliament therefore has the power to ensure that UK membership of the EEA is not ‘pulled down’ before the means to build it (or something similarly functional) up again are very firmly before its eyes.  A Jacobin government versus a Burkean parliament?  That would be an interesting conjunction.

 

[1]  There are in fact two ‘Single Markets’: the EU’s Internal Market and the EEA’s Single Market, each defined by a distinct international Treaty. The former has a wider policy scope, including the EU customs union and common commercial policy, agriculture, and fisheries, which are not covered by the EEA Agreement. The two market governance structures are also substantially different. Internal Market rule-making includes majority voting procedures (consent to which Mrs Thatcher came later to regret), and a single supervisory system based on the European Commission and the CJEU. In contrast, the EEA rule-making and supervisory system is dualistic in nature, relying on consensual agreement and reserving a right of veto for each of the non-EU States.

[2]  See Stewart Wood (2017), https://medium.com/@stewartwood6887/theresa-mays-mistaken-precedent-for-a-brexit-based-on-cherry-picking-1e2e6a3b9985 and Tim Shipman (2017), Fall Out (page 5).