The United Kingdom is at a crossroads: an ideological battle between natural science and behavioral science. Let’s hope for all our sakes we get this one right.
Boris Johnson, the UK Prime Minister, is facing a dilemma: when do we move from the so-called containment phase of controlling the Covid-19 coronavirus to the delay phase?
In the medical / natural science corner is the Chief Medical Officer, Professor Chris Whitty, who has presented himself calmly, reassuringly, as completely on top of his brief. He is a physician and an epidemiologist (as well as a lawyer, and an MBA). His evidence at the newly formed House of Commons Coronavirus Committee was calm, frank, precise. He is exactly the sort of advisor that any government would be proud to have. Flatten the peak. Delay the virus spread. Keep the height of the peak low. Save lives.
In the behavioral science corner is, well, I am not sure who. Maybe it’s the Chief Scientific Advisor, who highlighted the need to take account of behavioral science. Yes, please do. It’s a wicked problem, and please include more complex social modelling.
But what we are now seeing is what the Director-General of the World Health Organization (itself until now criticised for its seemingly political response to the issue) might describe as ‘alarming levels of inaction’.
I do hope, however, that Boris Johnson is being guided by the science, both behavioral and epidemiological, and not by advisors who profess to be superforecasters. You don’t have to be a superforecaster to forecast that if we get this wrong, many will die unnecessarily.
The 2011 film Contagion, starring the spectacularly ill-fated Gwyneth Paltrow, is a dramatization of a viral pandemic starting in circumstances closely analogous to the current Wuhan Coronavirus (2019-nCoV) outbreak. It’s a good film, and is a great introduction to the work of the Centers for Disease Control (CDC), which monitors the spread – the epidemiology – of the disease. There are two scenes where R-nought, or R0, is described:
Despite the blogger character in the clip describing the spread, using an R0 of 2, as a problem you can do on a napkin, it takes a little more thinking about. He also seems a bit confused about R0, talking about growth from 2 to 4 to 16, to 256, to 65,536 each day. That’s not what R0 is – it is not a rate – and in any case, if the rate were 2, this would mean 2 to 4 to 8 to 16 to 32 etc., each time doubling the number. It is possible that he is thinking that there are two generations per day, but that’s not what R0 is.
So, on to the professionals:
The CDC epidemiologist in the clip is more on point (despite the sloppy notation with no subscripts). This is better – it shows the reproduction number for the infection. Note again that this is not a rate – no time dimension is involved – it shows the average number of new cases that each case generates.
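To make the distinction concrete, here is a small illustrative sketch (my own, not from the film): with a reproduction number R0, each infected case generates R0 new cases in the next generation, so new cases per generation follow powers of R0.

```python
# Illustrative sketch: new cases per generation for a given
# reproduction number R0, starting from a single case.
def cases_per_generation(r0, generations):
    """Number of new cases in each generation, starting from one case."""
    return [r0 ** g for g in range(generations + 1)]

# With R0 = 2, each generation doubles: 1, 2, 4, 8, 16, 32
print(cases_per_generation(2, 5))

# The blogger's sequence (2, 4, 16, 256, 65,536) squares at each step -
# what you would get from two generations per step, not from R0 = 2.
print([2 ** (2 ** n) for n in range(5)])  # [2, 4, 16, 256, 65536]
```

Note that generations, not days, are the time unit here: R0 says nothing about how quickly one generation follows the next.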
This population modelling – so called SIR (Susceptible, Infected, Recovered) system dynamics modelling – is just one of several approaches that can be used to model contagion across a population. My recent paper ‘Spatial Transmission Models: A Taxonomy and Framework’ sets out a review of what they are and the advantages and disadvantages of each. In brief, we can model the population numbers, the individual agents that carry the virus, the network of contacts between infected individuals, or the regions or cells in which individuals are located (city districts, for example). The paper is available to read by clicking on the link here.
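As a flavour of the population-level approach, here is a minimal SIR sketch with assumed, purely illustrative parameter values (beta and gamma are not fitted to any real disease); it integrates the standard SIR equations with a simple Euler step.

```python
# Minimal SIR sketch with assumed (illustrative) parameters.
# S' = -beta*S*I/N;  I' = beta*S*I/N - gamma*I;  R' = gamma*I
def sir(n=1000, i0=1, beta=0.3, gamma=0.1, days=200, dt=0.1):
    """Euler-integrate the SIR equations; return final S, I, R and peak I."""
    s, i, r = n - i0, float(i0), 0.0
    peak = i
    for _ in range(int(days / dt)):
        new_infections = beta * s * i / n * dt
        new_recoveries = gamma * i * dt
        s = s - new_infections
        i = i + new_infections - new_recoveries
        r = r + new_recoveries
        peak = max(peak, i)
    return s, i, r, peak
```

With these values beta/gamma gives a basic reproduction number of 3, and lowering beta (the effect of interventions such as social distancing) both lowers and delays the peak – `sir(beta=0.2)` produces a far smaller peak than `sir(beta=0.3)` – which is exactly the ‘flatten the peak’ advice above.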
Members of Congregation: I look around this room and see privilege. Every one of us here in the Sheldonian Theatre is privileged; every member of Congregation reading the Gazette is privileged. We are privileged not by our past but by our present: we all have the power to share in the democratic self-governance of the institution that is the Collegiate University of Oxford.
But democratic self-governance is hard. It is time-consuming and troublesome, and is most easily left to specialists. Specialists with a track record of delivering strategic plans at high speed.
The Vice Chancellor warned us of the dangers of high speed in her 2017 Oration, and I quote: Over 2,000 years ago Tacitus pointed out that ‘Truth is confirmed by inspection and delay; falsehood by haste and uncertainty’.
It is tempting to react quickly to short-term opportunities in order to gain transient rewards, but this is, as my strategic management colleagues will confirm, often at the expense of more attractive opportunities foregone. We must, at the very least, be able to give ad hoc proposals the service of being fully inspected. The proposal to establish a new Society – or is it a College? – is a significant one, particularly when it is to have a distinctive culture, as was the case with Templeton College.
The reason that an ‘education priority’ within the Strategic Plan has abruptly become a press release announcing Parks College, without the knowledge of Congregation, is that such proposals are now increasingly made without proper scrutiny. While the Strategic Plan was put to Congregation for approval, the Implementation Plan referred to within the Strategic Plan was not. This ‘Plan within a Plan’ is administered by Programme Boards whose agenda and minutes are secret. In short, Congregation does not know what is going on, and its ability to give informed consent is subverted.
One of the strengths of Oxford that sets it apart from its ‘competitors’ is its self-governance. This has allowed the University to evolve and adapt to a changing environment, and mercifully not be suffocated by the latest management fads and fashions. It is bewildering that ‘Senior Managers’ do not appear to recognize the capabilities available to them within Congregation, preferring to operate in a more comfortable ‘command and control’, top-down fashion. If strategy is imposed, we as a University lose the ability to adapt and to take advantage of opportunities that may emerge – opportunities that may not be visible from the board room but are visible from the diversity of perspectives that each one of us holds as a unique member of Congregation.
The combined organizational capabilities of Congregation – all members of Congregation, experts in their own fields whatever they may be – are truly awe-inspiring. It is not always easy to find consensus, but that does not mean that this University should give up and follow the lowest common denominator of managerial hubris.
Congregation must be allowed to review and guide the Legislative Proposal to create Parks College prior to giving its approval. The Strategic Plan spoke of creating a new College by 2023, not a new Society in 2019.
The Nolan Committee on Standards in Public Life was established 25 years ago. The principles of openness and accountability which it set out are as relevant now as they have ever been. I urge you to vote against the Legislative Proposal while we still have the right to exercise that privilege.
Wednesday’s indicative votes in the House of Commons produced no definitive answer on the way forward. By using social network analysis, showing the size of each voting bloc together with ‘Hamming distances’ (ironically, usually used for error correction), we can map how close MPs are to each other, giving an indication of how a coalition could be formed if each bloc of MPs flipped a vote in order to reach a Parliamentary consensus.
Brexit is currently turning out to be a failed experiment in direct democracy, something I pointed out nearly three years ago.
However, with the House of Commons opening up its data, it does allow us a rare insight into the goings-on of the MPs who are our representatives.
One interesting data source is released by the UK Parliament showing the voting record of every MP for every ‘division’ (vote). One particularly interesting vote was held on Wednesday 27 March, when MPs were able to cast their votes on 8 motions:
By making a so-called bipartite network, we can map individual MPs to the votes for which they voted yes. This results in a map of MPs shown below.
While this is interesting, it doesn’t really show the distance between MPs’ voting intentions.
We can redraw the map by using the distance between MPs according to the votes they cast. We can do this by constructing a binary string of their votes. For simplicity, we count only the ‘aye’ or yes votes, and ignore abstentions and nos.
For instance, if an MP voted yes, no, no, yes, no, yes, yes, no, they would be given a string of 10010110, whereas if another MP voted no, no, yes, no, yes, yes, no, yes, they would be given a string of 00101101. So, what is the ‘distance’ between 10010110 and 00101101? For this, we use the Hamming distance – count the number of locations where there is a difference. In this case, the Hamming distance between the MPs is 6.
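The worked example above can be checked with a few lines of Python:

```python
# Hamming distance between two equal-length vote strings:
# the number of positions at which they differ.
def hamming(a: str, b: str) -> int:
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

print(hamming("10010110", "00101101"))  # 6, as in the example above
```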
By constructing a graph linking MPs at a Hamming distance of 1, we can identify neighbouring blocs of MPs. This is shown in the graph below.
I have listed the votes in the following order:
1. Common Market 2.0
2. Confirmatory Public Vote
3. Contingent Preferential Arrangement
4. Customs union
5. EFTA and EEA
6. Labour’s Alternative Plan
7. No Deal
8. Revocation to Avoid No Deal
However, this isn’t very useful, as it doesn’t show the type of MP that voted for each of these. So we can relabel the nodes with a representative MP from that bloc.
From this, you can work out the number of intermediate MPs to get to any other MPs. What is quite interesting is that every MP was just one vote away from another – no-one is isolated. Which, in some little way, gives us hope.
We can then weight the edges to show the possible coalition that could be made if these blocs were to join. And here it is:
The size of each circle represents the number of MPs that voted the same way as the representative MP named on the circle, and the thickness of the links shows how many MPs would join together if one vote were flipped.
If the linked blocs join up, you can see how there could be a path to a Parliamentary majority – for the blocs to join, it would mean switching one vote from ‘aye’ to ‘no’ or vice versa.
For completeness, the list of MPs and their associated binary string is linked here. You can find the MPs that are part of each bloc by searching for the MP name in the label on the network graph. The Hamming distance between each and every MP is available on request. I leave it to the reader to construct an affinity matrix – or what I would currently describe as a ‘Matrix of Hate’ – for each MP pair.
Right now, Houston is going through one of the most severe storms ever to hit the USA. The main conversation on today’s news was whether the Mayor (who has authority to do such things) should have evacuated the City prior to the arrival of Hurricane Harvey.
For a start, NOAA did not forecast a direct hit on the City. But it was forecast that potentially devastating rains were on the way.
Houston has been here before, of course, in 2005 when the then Mayor did order that the city be evacuated. And around 100 died, as a result of the gridlock and heat.
But let’s think about what an uncontrolled evacuation of Houston would mean.
While there is, of course, a Houston evacuation plan, assuming you want to avoid the Gulf of Mexico, the main routes are via the north and west: I-69 to the north-east, I-45 to the north, US Route 290 to the north-west, and I-10 to the west.
Now let’s consider the capacity of these roads. Freeway capacity in the US is set out in the Highway Capacity Manual. While there is a whole science devoted to freeway flow measurement, you need to take into account not only the capacity of the road (the number of cars it can hold) but also their speed. Combining these gives a flow rate, i.e. the number of cars that will pass a point in a particular length of time. We can look at the academic literature to see what this is. Dixit and Wolshon (2014) have a nice study of maximum evacuation flow rates; their Table 2 shows the empirical data, which works out at around 1,000 vehicles per hour per lane. Assume the Houston evacuation routes to the north and west comprise around four routes of four lanes each, add a factor of 1.5 for contraflows, and you have around 25 lanes. So that’s 25 x 1,000 = 25,000 vehicles per hour. And let’s assume an occupancy of 4 passengers per vehicle (i.e. most would evacuate by car). So that’s 100,000 passengers per hour.
The problem with Houston is that it’s the USA’s fourth-largest city. And that means it’s big: Greater Houston has a population of 6.5 million. So that means 6.5 million / 100,000 = 65 hours. Non-stop, day and night, without accidents. A very bold move for a hurricane that was not forecast to hit directly.
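The arithmetic above can be laid out explicitly (all figures are the rough assumptions from the text, not official evacuation data):

```python
# Back-of-envelope evacuation estimate using the figures above
# (rough assumptions, not official data).
flow_per_lane = 1_000   # vehicles/hour/lane (approx., after Dixit & Wolshon 2014)
lanes = 25              # 4 routes x 4 lanes, x1.5 contraflow factor, rounded up
occupancy = 4           # passengers per vehicle
population = 6_500_000  # Greater Houston

people_per_hour = lanes * flow_per_lane * occupancy
hours = population / people_per_hour
print(people_per_hour, hours)  # 100000 65.0
```

Every input here is optimistic: constant maximum flow, full cars, no breakdowns. Real evacuations run slower.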
I am looking for high-quality, numerate candidates to fill these exciting PhD studentships with me as a (co-)supervisor at Loughborough’s School of Business and Economics. Please note that this post has been updated with new links (in blue, below).
The first is modelling dynamic responses to dynamic threats; the second is using analytics in traditional industries. Please see the blue links below for further details and how to apply.
One of the most challenging issues for policy makers dealing with bio-security threats is their dynamic nature: diseases may spread quickly and lethally through vulnerable populations, and pandemics may cause many casualties.
Finding the appropriate response to threats is a major challenge. Whilst models exist for understanding the dynamics of the threats themselves, responses can be largely ad hoc or ‘firefighting’. The aim of this research is to produce robust responses to dynamic threats.
The research will build up as follows, from low to high complexity: static responses to static threats; static responses to dynamic threats; dynamic responses to static threats; and dynamic responses to dynamic threats.
We will use a variety of methods to define the best response: cellular automata, network analysis, spatial modelling, agent-based modelling, and the generation of dynamic fitness landscapes.
This PhD studentship is most suitable for candidates with a background in a quantitative discipline such as management science, operations research, engineering, physics and other natural sciences.
The rise of business analytics has created enormous opportunities within the private sector, but these benefits have yet to be fully realized in public services and regulated industries such as energy, water, and transportation networks. Meanwhile, governments are mandating the collection of data by installing smart metering devices. This gives rise to the need for innovative ways of thinking in industries that are still largely based on traditional economic thinking, with its conventional assumptions about optimization and behaviour.
As an example, the energy sector is characterised by strongly defined market structures with incumbents and an ultimate need for energy network security, which not only prevents the quick adoption of technical changes but also translates into regulatory outcomes, such as price caps.
This exciting PhD opportunity will integrate theoretical and empirical approaches and spans two strengths of Loughborough’s School of Business and Economics: microeconomics and particularly rigorous analysis of the determinants of productivity and performance (including cost modelling) and management science (including simulation and network analysis).
We are therefore seeking a student with a quantitative background (whether in economics, management science, engineering, physics or other natural sciences). A willingness to learn new techniques such as cost modelling, performance measurement, agent-based modelling and network analysis is desired.
We will introduce a single capital floor, set at £100,000, more than four times the current means test threshold. This will ensure that, no matter how large the cost of care turns out to be, people will always retain at least £100,000 of their savings and assets, including value in the family home.
A quick calculation of the effective ‘tax’ (care fees as a percentage of initial wealth) shows the following distribution of tax rates. On the x-axis is initial wealth (the value of your house plus any savings), and on the y-axis is the tax rate. The Conservative Party have since augmented this plan with a proposed cap (consultation to come).
Values used for care fees: £20,000, £40,000, £60,000, £80,000, £100,000. Values used for initial wealth: £0, £100,000, …, £1,000,000. The trend continues downwards after this figure.
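The figures can be reproduced with a short sketch, assuming (as I read the proposal) that care fees are paid only out of wealth above the £100,000 floor:

```python
# Effective 'tax' rate under a £100,000 capital floor: fees are paid
# only out of wealth above the floor (my reading of the proposal).
FLOOR = 100_000

def effective_rate(wealth, fees):
    """Care fees paid, as a fraction of initial wealth."""
    if wealth <= 0:
        return 0.0
    paid = min(fees, max(wealth - FLOOR, 0))
    return paid / wealth

# With £100,000 of fees, the rate peaks where wealth = floor + fees...
print(effective_rate(200_000, 100_000))  # 0.5
# ...and falls away for wealthier households.
print(effective_rate(1_000_000, 100_000))  # 0.1
```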
As many have pointed out, this affects individuals with initial wealth just over £100,000 proportionately far more than those with higher initial wealth. More detail on policy options for funding social care can be found in the Dilnot Commission report and a summary of their proposals is shown below:
The ‘℮’ symbol, or the ‘e-mark’ is a symbol you will see on packaging such as tins or packets in Europe. Millions of us will see this symbol every day, but what does it actually mean?
The raison d’être for the e-mark comes from the problem of selling goods to the public. We would all like to think that we are getting what we pay for, but does that mean we should always get at least what we pay for?
Well, if you use the ℮-mark, then no. If you don’t, then yes. So you use the ℮-mark. By doing so, some of us are short-changed, but, on average, we shouldn’t be.
The e-mark was introduced in 1976 by the legislation known by the snappy title of ‘Council Directive 76/211/EEC of 20 January 1976 on the approximation of the laws of the Member States relating to the making-up by weight or by volume of certain prepackaged products’.
This sets out a nominal value for a product. This means that, on average, we should not receive less than the value stated before the e-mark. But we would be really annoyed if we received, say, nothing, and someone else received twice the nominal amount. So, the concept of tolerable negative error was introduced at the same time, to set out the minimum legal amount that each packet or tin or container should contain. The idea is that only a few containers may weigh less than the declared value minus the tolerable negative error (and none may fall short by twice the tolerable negative error… that would be, well, intolerable).
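These three rules can be sketched as a simple compliance check. This is illustrative only: the 2% defect allowance and the TNE value in the example are my assumptions, not the directive’s actual sampling scheme.

```python
# Sketch of the three rules above for a batch of packages.
# The 2% defect allowance is an assumption for illustration,
# not the directive's actual sampling scheme.
def batch_ok(weights, nominal, tne, max_defect_fraction=0.02):
    mean_ok = sum(weights) / len(weights) >= nominal       # average >= nominal
    defects = sum(w < nominal - tne for w in weights)      # below nominal - TNE
    few_defects = defects / len(weights) <= max_defect_fraction
    none_intolerable = all(w >= nominal - 2 * tne for w in weights)
    return mean_ok and few_defects and none_intolerable

# A 400 g tin with a hypothetical 12 g (3%) TNE:
print(batch_ok([401, 399, 402, 398, 400], nominal=400, tne=12))  # True
```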
In packets from 5 grams to 10 kilogrammes, the tolerable negative error varies from 9% (quite a lot) to 1.5% (not such a lot), the rationale being that it is easier to measure larger values with greater accuracy.
Eight Mile, epitomized by Eminem in the film of the same name, is a street in Detroit that marks the boundary between the majority white northern suburbs and the majority black neighborhoods closer to the inner city.
But what causes this segregation in the first place?
Hypothesis 1: The Central Planner
In Detroit’s case, as with many cities across the USA, it was, in part, due to the zoning of the city by the federal Home Owners’ Loan Corporation, which graded the city into areas of lending risk, meaning that banks were indirectly encouraged to develop the outer suburbs while not offering mortgages on inner-city properties. This led to wealthier, generally white, residents moving to the suburbs.
Indeed, physical barriers, such as the Detroit Wall (also known as the Eight Mile Wall), were built to separate majority-black and majority-white neighborhoods.
The legacy of these zones lives on today, as seen in the map below from the 2010 US Census. The dividing line between the green (black) areas and the blue (white) areas is Eight Mile Road.
So, segregation exists, and is caused by a central actor. But is there an alternative explanation?
Alternative Hypothesis: Emergence
In 1971, Thomas Schelling set out to model the phenomenon, not by assuming a central planner, but by modelling the interactions of individuals.
Thomas Schelling’s model was this. Assume individuals are placed in a grid, similar to being located on a chess board. Allow individuals who are in a local minority to move. In the example below, the blue circle is in a minority (with 5 out of its 6 neighbors being a different color), and according to the rules of the model, is unhappy. It could decide to move to the vacant square to the north-west, but it would still be in a local minority (with 4 out of 6 neighbors being a different color) and would remain unhappy. So instead, it chooses the space to the south west where 3 out of its 6 neighbors are of the same color, and not being in a minority, it settles there.
Schelling, perhaps without knowing it, introduced agent-based modelling, where, instead of modelling the system as a whole, the modelling of individual agents enables us to see the emergence of macro-level properties, in this case segregation, via the modelling of micro-level (local) interactions.
We can see the effect of micro-level interactions causing macro-level segregation in the model below (developed by Duncan Robertson after Wilensky after Schelling). Each individual, or agent, decides whether they are unhappy or happy; if they are unhappy, they search until they find a vacant location where they will become happy. This continues until all individuals attain happiness.
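For readers who want to experiment, here is a minimal Python sketch of such a model (after Wilensky after Schelling), simplified so that an unhappy agent jumps to a random empty cell and keeps moving until content; all parameter values are assumptions.

```python
# Minimal Schelling sketch: agents in a local minority (below the
# threshold of same-colour neighbours) move to random empty cells.
import random

def schelling(size=20, empty=0.2, threshold=0.5, steps=20_000, seed=1):
    random.seed(seed)
    grid = {}
    for x in range(size):
        for y in range(size):
            r = random.random()
            grid[(x, y)] = (None if r < empty
                            else 'red' if r < (1 + empty) / 2 else 'blue')

    def like_fraction(x, y):
        # Fraction of occupied (wrapped) Moore neighbours sharing this colour.
        c = grid[(x, y)]
        nbrs = [grid[((x + dx) % size, (y + dy) % size)]
                for dx in (-1, 0, 1) for dy in (-1, 0, 1) if (dx, dy) != (0, 0)]
        nbrs = [n for n in nbrs if n is not None]
        return sum(n == c for n in nbrs) / len(nbrs) if nbrs else 1.0

    def average_like():
        fracs = [like_fraction(x, y) for (x, y) in grid if grid[(x, y)] is not None]
        return sum(fracs) / len(fracs)

    cells = list(grid)
    empties = [p for p in cells if grid[p] is None]
    before = average_like()
    for _ in range(steps):
        p = random.choice(cells)
        if grid[p] is None or like_fraction(*p) >= threshold:
            continue  # empty cell, or agent already content
        i = random.randrange(len(empties))
        q = empties[i]
        grid[q], grid[p] = grid[p], None  # agent jumps to the empty cell
        empties[i] = p
    return before, average_like(), grid
```

On a typical run the average fraction of like-coloured neighbours rises well above its initial value of roughly one half, even though every agent is content in a slight local minority: segregation emerges at the macro level without anyone demanding it.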
So, perhaps segregation is not imposed, but is down to us. Or maybe, in reality, it’s a little bit of both.
Please do get in touch if you would like to discuss building or working with agent-based models.