Social Distancing using Play-Doh and Matches

There has been much talk about using social distancing as a response to the Coronavirus outbreak. Here’s a video, made with Play-Doh and matches, showing how social distancing plays out in three scenarios:

  • Close Contact, where the population is infected very rapidly. This is not A Good Thing – the national health service is overwhelmed, and the death rate goes up as a result
  • Extreme Distancing, where the viral spread takes too long. This is not A Good Thing – the social and economic costs are huge
  • Optimal Distancing, where the spread is controlled and the human, social, and economic costs are optimized. This is A Good Thing. It is also very tricky to get right.
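To make the three scenarios more concrete, here is a minimal SIR-style simulation sketch in Python (my own illustration under simple assumptions, not the model behind the video), showing how reducing the contact rate lowers and delays the peak of infections:

# Minimal SIR-style sketch (illustrative assumptions, not the video's model).
# Lower contact rates flatten and delay the peak of infections.

def simulate(contact_rate, recovery_rate=0.1, population=1000, days=400):
    s, i, r = population - 1.0, 1.0, 0.0
    peak, peak_day = 0.0, 0
    for day in range(days):
        new_infections = contact_rate * s * i / population
        new_recoveries = recovery_rate * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        if i > peak:
            peak, peak_day = i, day
    return peak, peak_day

for label, rate in [("Close Contact", 0.5),
                    ("Optimal Distancing", 0.2),
                    ("Extreme Distancing", 0.12)]:
    peak, day = simulate(rate)
    print(f"{label:20s} peak ~{peak:4.0f} infected on day {day}")

A high contact rate gives a large, early peak; a very low contact rate gives a small peak but a long-drawn-out outbreak, which is exactly the trade-off the three scenarios describe.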

Coronavirus: The First Big Test of Behavioral Science

The United Kingdom is at a crossroads, an ideological battle between natural science and behavioral science. Let’s hope for all our sakes we get this one right.

Boris Johnson, the UK Prime Minister, is facing a dilemma: when do we go from the so-called containment phase for controlling the Covid-19 Coronavirus to the delay phase?

In the medical / natural science corner is the Chief Medical Officer, Professor Chris Whitty, who has presented himself calmly, reassuringly, and as completely on top of his brief. He is a physician and an epidemiologist (as well as being a lawyer and holding an MBA). His evidence at the newly formed House of Commons Coronavirus Committee was calm, frank, and precise. He is exactly the sort of advisor that any government would be proud to have. Flatten the peak. Delay the virus spread. Keep the height of the peak low. Save lives.

Mitigation efforts like social distancing help reduce the disease caseload on any given date, and can keep the healthcare system from becoming overwhelmed.
Image: New York Times adapted from CDC/Economist https://www.nytimes.com/2020/03/11/science/coronavirus-curve-mitigation-infection.html

In the behavioral science corner is, well, I am not sure who. Maybe it’s the Chief Scientific Advisor, who highlighted the need to take account of behavioral science. Yes, please do. It’s a wicked problem, and please include more complex social modelling.

But what we are now seeing is what the Director General of the World Health Organization (an organisation until now itself criticised for its seemingly political response to the issue) might well describe as ‘alarming levels of inaction’.

I do hope, however, that Boris Johnson is being guided by the science, both behavioral and epidemiological, and not by advisors who profess to be superforecasters. You don’t have to be a superforecaster to forecast that if we get this wrong, many will die unnecessarily.

Technical Note: Calculating Hamming Distances Between Two Binary Strings in Excel

I haven’t seen this explained anywhere, so here’s how to calculate the Hamming distance between two binary strings in Excel.

3-bit binary cube Hamming distance examples
Credit: Wikipedia / en:User:Cburnett

Suppose you have two binary strings, say 001001 and 100100. How do you calculate their Hamming distance? It turns out this isn’t that easy in Excel, but it is possible.

How do we calculate the Hamming distance between 100 and 011 in the cube above, shown as the shortest (red) line joining these two points?

To cut to the chase, the formula for calculating the Hamming distance between strings in cell A1 and B1 is:

=LEN(B1) - LEN(SUBSTITUTE(DEC2BIN(BITXOR(BIN2DEC(B1), BIN2DEC(A1)), LEN(B1)), "1", ""))

BITXOR compares the bits of its two inputs according to Exclusive OR (XOR) logic, returning a 1 in every position where the bits differ:

A   B   A XOR B
0   0   0
0   1   1
1   0   1
1   1   0

However, BITXOR works on decimal numbers, so the BIN2DEC function first converts each binary string to a decimal, and DEC2BIN converts the XOR result back into a binary string (padded to the length of the original).

And finally, a trick for counting the occurrences of a character in a string:

LEN(A1)-LEN(SUBSTITUTE(A1,"1","")) counts the number of 1s in the string.
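For anyone who wants to sanity-check the spreadsheet, here is the same XOR-and-count-the-1s pipeline as a short Python sketch (my own addition, not part of the original note). One caveat with the Excel route: BIN2DEC and DEC2BIN only accept up to 10 bits, so for longer strings a character-by-character comparison is safer.

# Python equivalent of the Excel formula: BIN2DEC + BITXOR + count the 1s.

def hamming_distance(a: str, b: str) -> int:
    if len(a) != len(b):
        raise ValueError("strings must be the same length")
    xor = int(a, 2) ^ int(b, 2)      # BIN2DEC on each string, then BITXOR
    return bin(xor).count("1")       # DEC2BIN, then count the 1s

print(hamming_distance("001001", "100100"))  # 4
print(hamming_distance("100", "011"))        # 3 -- the red path on the cube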

As I say, quite a technical note, and I hope useful to people looking at Hamming distances.

Agent-Based Strategizing: New Book Published at Cambridge University Press

My new book, Agent-Based Strategizing, has been published by Cambridge University Press. It is available to download for free until 31 July 2019 at the link below. The book is an overview of how agent-based modelling has been (and can be) used in strategic management.

https://www.cambridge.org/core/elements/agentbased-strategizing/4AD9D0D7416DE46AEB7F1A5478772ACF

Abstract: Strategic management is a system of continual disequilibrium, with firms in a continual struggle for competitive advantage and relative fitness. Models that are dynamic in nature are required if we are to really understand the complex notion of sustainable competitive advantage. New tools are required to tackle the challenge of how firms should compete in environments characterized by both exogenous shocks and intense endogenous competition. Agent-based modelling of firms’ strategies offers an alternative analytical approach, where individual firms or component parts of a firm are modelled, each with their own strategy. Where traditional models can assume homogeneity of actors, agent-based models simulate each firm individually. This allows experimentation with strategic moves, which is particularly important where reactions to strategic moves are non-trivial. This Element introduces agent-based models and their use within management, reviews the influential NK suite of models, and offers an agenda for the development of agent-based models in strategic management.
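To give a flavour of the NK models the Element reviews, here is a minimal, hypothetical sketch of an NK-style fitness landscape with a single firm performing local hill climbing; it illustrates the general idea only and is not code from the book:

import itertools
import random

# Hypothetical NK-style landscape sketch (illustration only, not from the book).
# Each of N choices contributes a fitness that depends on its own state and the
# states of K other choices; larger K gives a more rugged landscape.

def make_nk_landscape(N, K, seed=1):
    rng = random.Random(seed)
    tables = [{bits: rng.random() for bits in itertools.product((0, 1), repeat=K + 1)}
              for _ in range(N)]
    def fitness(config):
        return sum(tables[i][tuple(config[(i + j) % N] for j in range(K + 1))]
                   for i in range(N)) / N
    return fitness

# One "firm" hill-climbs by flipping single choices while that improves fitness.
N, K = 12, 3
fitness = make_nk_landscape(N, K)
rng = random.Random(42)
config = [rng.randint(0, 1) for _ in range(N)]
improved = True
while improved:
    improved = False
    for i in range(N):
        trial = config[:i] + [1 - config[i]] + config[i + 1:]
        if fitness(trial) > fitness(config):
            config, improved = trial, True
print("local peak fitness:", round(fitness(config), 3))

With K = 0 the landscape has a single peak; raising K creates many local peaks, which is why the order and interdependence of a firm’s moves start to matter.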

Spatial Transmission Models: A Taxonomy and Framework

Risk Analysis

This paper, published in the journal Risk Analysis, reviews the different methods used for modelling the spread of an idea, a disease, or similar phenomena over space.

ABSTRACT

Within risk analysis and more broadly, the decision behind the choice of which modelling technique to use to study the spread of disease, epidemics, fires, technology, rumors, or more generally spatial dynamics, is not well documented.

While individual models are well defined and the modeling techniques are well understood by practitioners, there is little deliberate choice made as to the type of model to be used, with modelers using techniques that are well accepted in the field, sometimes with little thought as to whether alternative modelling techniques could or should be used.

In this paper, we divide modelling techniques for spatial transmission into four main categories: population-level models, where a macro-level estimate of the infected population is required; cellular models, where the transmission takes place between connected domains, but is restricted to a fixed topology of neighboring cells; network models, where host-to-host transmission routes are modelled, either as planar spatial graphs or where short cuts can take place as in social networks; and finally agent-based models, which model the local transmission between agents, either as host-to-host geographical contacts, or by modelling the movement of the disease vector, with dynamic movement of hosts and vectors possible, on a Euclidean space or a more complex space deformed by the existence of information about the topology of the landscape using GIS techniques. We summarize these techniques by introducing a taxonomy classifying these modeling approaches.

Finally, we present a framework for choosing the most appropriate spatial modelling method, highlighting the links between seemingly disparate methodologies, bearing in mind that the choice of technique rests with the subject expert.
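As a concrete illustration of one of the categories (not a model from the paper), here is a minimal sketch of a cellular model, where transmission is restricted to a fixed topology of neighbouring cells on a grid:

import random

# Minimal cellular transmission sketch (illustrative; not a model from the paper).
# Cells sit on a grid and infection can only pass to the four neighbouring cells,
# i.e. transmission between connected domains with a fixed topology.

def step(grid, p_transmit, rng):
    rows, cols = len(grid), len(grid[0])
    new_grid = [row.copy() for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1:  # infected cell
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                        if rng.random() < p_transmit:
                            new_grid[nr][nc] = 1
    return new_grid

rng = random.Random(0)
size = 21
grid = [[0] * size for _ in range(size)]
grid[size // 2][size // 2] = 1          # seed the infection in the centre
for _ in range(30):
    grid = step(grid, p_transmit=0.3, rng=rng)
print("infected cells after 30 steps:", sum(map(sum, grid)))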

PhD Studentship in Modelling Dynamic Responses to Dynamic Threats at Loughborough University [applications now closed]

I am co-supervising the following PhD project – the application link and further details can be found here: http://www.lboro.ac.uk/study/postgraduate/research-degrees/funded/modelling-dynamic-responses/ .  The closing date is 14 December 2017.  Please get in touch if you would like to discuss this opportunity.

One of the most challenging issues for policy makers dealing with bio-security threats is their dynamic nature: diseases may spread quickly and lethally among vulnerable populations, and pandemics may cause many casualties.

Finding the appropriate response to threats is a major challenge. Whilst models exist for understanding the dynamics of the threats themselves, responses can be largely ad hoc or ‘firefighting’. The aim of this research is to produce robust responses to dynamic threats.

The research will build up as follows, from low to high complexity: static responses to static threats; static responses to dynamic threats; dynamic responses to static threats; and dynamic responses to dynamic threats.

We will use a variety of methods to define the best response: cellular automata, network analysis, spatial modelling, agent-based modelling, and the generation of dynamic fitness landscapes.

This PhD studentship is most suitable for candidates with a background in a quantitative discipline such as management science, operations research, engineering, physics and other natural sciences.

Why The Mayor of Houston Was Right to Not Evacuate the City

Right now, Houston is going through one of the most severe storms ever to hit the USA.  The main conversation on today’s news was whether the Mayor (who has authority to do such things) should have evacuated the City prior to the arrival of Hurricane Harvey.

For a start, NOAA did not forecast a direct hit on the City.  But it was forecast that potentially devastating rains were on the way.

Houston has been here before, of course, in 2005 when the then Mayor did order that the city be evacuated.  And around 100 died, as a result of the gridlock and heat.

But let’s think about what an uncontrolled evacuation of Houston would mean.

This is the map of the Houston highway system.

And here it is on Google maps.

 

While there is, of course, a Houston evacuation plan, assuming you want to avoid the Gulf of Mexico, the main routes are via the north and west: I69 to the north-east, I45 to the north, US Route 290 to the north-west, and I10 to the west.

Now let’s consider the capacity of these roads. The capacity of roads in the US is given by the Highway Capacity Manual (published by the Transportation Research Board). While there is a whole science devoted to calculating freeway flow measurements, you need to take into account not only the capacity of the road (the number of cars), but also their speed. Combining these gives us a flow rate, i.e. the number of cars that will pass a point in a particular length of time. We can look at the academic literature to see what this is. Dixit and Wolshon (2014) have a nice study where they looked at maximum evacuation flow rates. Their Table 2 shows the empirical data, but it’s around 1,000 vehicles per hour per lane. Assume the Houston metro evacuation routes to the north and west total around 4 routes x 4 lanes, i.e. 16 lanes. Allow a factor of 1.5 for contraflows, and you have around 25 lanes. So that’s 25 x 1,000 = 25,000 vehicles per hour. And let’s assume an occupancy of 4 passengers per vehicle (i.e. most would evacuate by car). So that’s 100,000 people per hour.

The problem with Houston is that it’s the USA’s fourth largest city. And that means it’s big: Greater Houston has a population of around 6.5 million. So that means 6.5 million / 100,000 = 65 hours. Non-stop, day and night. Without accidents. A very bold move for a hurricane that was not forecast to hit the city directly.

And tropical storm watches are typically issued only 48 hours before winds are due to strike.
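As a back-of-the-envelope check, here is the same arithmetic as a short Python sketch, using the rough assumptions quoted above:

# Back-of-the-envelope evacuation estimate with the rough assumptions above.

flow_per_lane = 1000        # vehicles/hour/lane (Dixit and Wolshon, 2014, Table 2)
lanes = 25                  # ~4 routes x 4 lanes x 1.5 contraflow, rounded up
occupancy = 4               # people per vehicle
population = 6_500_000      # Greater Houston

vehicles_per_hour = lanes * flow_per_lane          # 25,000 vehicles/hour
people_per_hour = vehicles_per_hour * occupancy    # 100,000 people/hour
hours = population / people_per_hour               # 65 hours

print(f"{people_per_hour:,} people/hour -> about {hours:.0f} hours to clear the city")
print("versus a typical tropical storm watch lead time of ~48 hours")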

By not evacuating, the city keeps its emergency resources in Houston, rather than dispersing them across the locations of incidents caused by evacuating traffic.

The real test however comes in the days and months ahead, where the process of rescue, recovery, and rebuilding is critical.

References

Dixit, V. and Wolshon, B. (2014) ‘Evacuation Traffic Dynamics’, Transportation Research Part C, 49, 114-125

 

Two Funded PhD Studentships in Agent-Based Modelling at Loughborough University School of Business and Economics [applications now closed]

I am looking for high-quality, numerate candidates to fill these exciting PhD studentships with me as a (co-)supervisor at Loughborough’s School of Business and Economics. Please note that this post has been updated with new links (in blue, below).

The first is modelling dynamic responses to dynamic threats; the second is using analytics in traditional industries. Please see the blue links below for further details and how to apply.

Modelling Dynamic Responses to Dynamic Threats (with Professor Gilberto Montibeller)

One of the most challenging issues for policy makers dealing with bio-security threats is their dynamic nature: diseases may spread quickly and lethally among vulnerable populations, and pandemics may cause many casualties.

Finding the appropriate response to threats is a major challenge. Whilst models exist for understanding the dynamics of the threats themselves, responses can be largely ad hoc or ‘firefighting’. The aim of this research is to produce robust responses to dynamic threats.

The research will build up as follows, from low to high complexity: static responses to static threats; static responses to dynamic threats; dynamic responses to static threats; and dynamic responses to dynamic threats.

We will use a variety of methods to define the best response: cellular automata, network analysis, spatial modelling, agent-based modelling, and the generation of dynamic fitness landscapes.

This PhD studentship is most suitable for candidates with a background in a quantitative discipline such as management science, operations research, engineering, physics and other natural sciences.

Business Analytics for Public Services and Regulated Industries: New Techniques for Analytics-Driven Decision Making in Traditional Industries (with Dr Maria Neiswand and Professor David Saal)

The rise of business analytics has created enormous opportunities within the private sector, but these benefits have yet to be fully realized in public services and regulated industries such as energy, water, and transportation networks. Meanwhile, governments are mandating the collection of data through the installation of smart metering devices. This gives rise to the need for innovative ways of thinking in industries that are still largely based on traditional economic thinking, involving conventional assumptions about optimization and behaviour.

As an example, the energy sector is characterised by strongly defined market structures with incumbents and an ultimate need for energy network security, which not only prevents the quick adoption of technical changes but also translates into regulatory outcomes, such as price caps.

This exciting PhD opportunity will integrate theoretical and empirical approaches and spans two strengths of Loughborough’s School of Business and Economics: microeconomics and particularly rigorous analysis of the determinants of productivity and performance (including cost modelling) and management science (including simulation and network analysis).

We are therefore seeking a student with a quantitative background (whether in economics, management science, engineering, physics, or other natural sciences). A willingness to learn new techniques such as cost modelling, performance measurement, agent-based modelling, and network analysis is desirable.