PhD Studentship in Modelling Dynamic Responses to Dynamic Threats at Loughborough University

I am co-supervising the following PhD project – the application link and further details can be found here: http://www.lboro.ac.uk/study/postgraduate/research-degrees/funded/modelling-dynamic-responses/. The closing date is 14 December 2017. Please get in touch if you would like to discuss this opportunity.

One of the most challenging issues for policy makers dealing with bio-security threats is their dynamic nature: diseases may spread quickly and lethally among vulnerable populations, and pandemics may cause many casualties.

Finding the appropriate response to threats is a major challenge. Whilst models exist for understanding the dynamics of the threats themselves, responses are often ad hoc or ‘firefighting’. The aim of this research is to produce robust responses to dynamic threats.

The research will build up as follows, from low to high complexity: static responses to static threats; static responses to dynamic threats; dynamic responses to static threats; and dynamic responses to dynamic threats.

We will use a variety of methods to define the best response: cellular automata, network analysis, spatial modelling, agent-based modelling, and the generation of dynamic fitness landscapes.

This PhD studentship is most suitable for candidates with a background in a quantitative discipline such as management science, operations research, engineering, physics and other natural sciences.

Why The Mayor of Houston Was Right Not to Evacuate the City

Right now, Houston is going through one of the most severe storms ever to hit the USA.  The main conversation on today’s news was whether the Mayor (who has authority to do such things) should have evacuated the City prior to the arrival of Hurricane Harvey.

For a start, NOAA did not forecast a direct hit on the City.  But it was forecast that potentially devastating rains were on the way.

Houston has been here before, of course: in 2005, ahead of Hurricane Rita, the then Mayor did order that the city be evacuated. And around 100 died as a result of the gridlock and heat.

But let’s think about what an uncontrolled evacuation of Houston would mean.

This is the map of the Houston highway system.

And here it is on Google Maps.

While there is, of course, a Houston evacuation plan, assuming you want to avoid the Gulf of Mexico, the main routes are via the north and west: I69 to the north-east, I45 to the north, US Route 290 to the north-west, and I10 to the west.

Now let’s consider the capacity of these roads.  The capacity of roads in the US is given by the Department of Transportation’s Highway Capacity Manual.  While there is a whole science devoted to calculating freeway flow measurements, you need to take into account not only the capacity of the road (the number of cars) but also their speed.  Combining these gives us a flow rate, i.e. the number of cars that will pass a point in a particular length of time.  We can look at the academic literature to see what this is.  Dixit and Wolshon (2014) have a nice study where they looked at maximum evacuation flow rates.  Their Table 2 shows the empirical data: it’s around 1,000 vehicles per hour per lane.  Assume the Houston metro evacuation routes to the north and west comprise around 4 x 4 lanes.  Allow a factor of 1.5 for contraflows, and you have around 25 lanes.  So that’s 25 x 1,000 = 25,000 vehicles per hour.  And let’s assume an occupancy of 4 passengers per vehicle (i.e. most would evacuate by car).  So that’s 100,000 passengers per hour.

The problem with Houston is that it’s the USA’s fourth largest city.  And that means it’s big.  Greater Houston has a population of 6.5 million.  So that means 6.5 million / 100,000 = 65 hours.  Non-stop, day and night.  Without accidents.  A very bold move for a hurricane that was not forecast to hit directly.
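
The whole calculation fits in a few lines of code.  A minimal sketch, where every input (flow rate, lane count, occupancy, population) is the rough estimate given above rather than a measured value:

```python
# Back-of-the-envelope evacuation time for Houston.
# All inputs are the rough estimates from the text above, not measured values.

flow_per_lane = 1_000     # vehicles per hour per lane (Dixit and Wolshon, 2014)
lanes = 25                # ~4 routes x 4 lanes, x1.5 for contraflow, rounded
occupancy = 4             # passengers per vehicle
population = 6_500_000    # Greater Houston

vehicles_per_hour = flow_per_lane * lanes        # 25,000 vehicles per hour
people_per_hour = vehicles_per_hour * occupancy  # 100,000 people per hour
hours = population / people_per_hour             # 65 hours

print(f'{people_per_hour:,} people per hour -> {hours:.0f} hours to evacuate')
```

Vary any of the inputs and the conclusion barely changes: even generous assumptions leave an evacuation time measured in days, not hours.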

And tropical storm watches are typically issued only 48 hours before winds are due to strike.

By not evacuating, resources are kept in Houston rather than being dispersed across the locations of incidents caused by evacuating traffic.

The real test however comes in the days and months ahead, where the process of rescue, recovery, and rebuilding is critical.

References

Dixit, V. and Wolshon, B. (2014) ‘Evacuation Traffic Dynamics’, Transportation Research Part C: Emerging Technologies, 49, 114–125.

The Most Competitive Airline Routes in the World

I am using airline data to construct a network of competition in the airline industry.  As part of this, I am listing the routes that are the most competitive – not necessarily the ones that have the most flights, but the ones that have the most competitors.
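
As a minimal sketch of how such a ranking might be computed, assuming a flat table of flight records (the records and column layout below are hypothetical placeholders, not my actual dataset):

```python
from collections import defaultdict

# Rank routes by the number of distinct competitors, not the number of flights.
# The flight records below are hypothetical placeholders.
flights = [
    ('HKG', 'ICN', 'Cathay Pacific'),
    ('HKG', 'ICN', 'Korean Air'),
    ('ICN', 'HKG', 'Asiana Airlines'),
    ('SIN', 'CGK', 'Singapore Airlines'),
    ('SIN', 'CGK', 'Garuda Indonesia'),
]

carriers_by_route = defaultdict(set)
for origin, dest, carrier in flights:
    route = tuple(sorted((origin, dest)))  # treat both directions as one route
    carriers_by_route[route].add(carrier)

for route, carriers in sorted(carriers_by_route.items(),
                              key=lambda kv: len(kv[1]), reverse=True):
    print('-'.join(route), len(carriers), sorted(carriers))
```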

And here they are:

Map generated from GCMap.com

HKG-ICN  Hong Kong – Incheon, Seoul (South Korea)

Not shown: EastarJet

TPE-NRT  Taipei (Taiwan) – Narita, Tokyo (Japan)

Not shown: Vanilla Air, Tigerair, Scoot, TransAsia

SIN-CGK   Singapore – Jakarta (Indonesia)

Not shown: Indonesia AirAsia, Scoot, Jetstar Asia

SIN-DPS   Singapore – Denpasar, Bali (Indonesia)

Not shown: Qantas, Qatar Airways, Indonesia AirAsia, Scoot (NB Jetstar and Jetstar Asia are different airlines)

Please note that these data are a few years old, are preliminary and not completely accurate, and airlines come and go on such competitive routes.

Two Funded PhD Studentships in Agent-Based Modelling at Loughborough University School of Business and Economics

I am looking for high-quality, numerate candidates to fill these exciting PhD studentships with me as a (co-)supervisor at Loughborough’s School of Business and Economics.  Please note that this post has been updated with new links (in blue, below).

The first is modelling dynamic responses to dynamic threats; the second is using analytics in traditional industries.  Please see the blue links below for further details and how to apply.

Modelling Dynamic Responses to Dynamic Threats (with Professor Gilberto Montibeller)

One of the most challenging issues for policy makers dealing with bio-security threats is their dynamic nature: diseases may spread quickly and lethally among vulnerable populations, and pandemics may cause many casualties.

Finding the appropriate response to threats is a major challenge.  Whilst models exist for understanding the dynamics of the threats themselves, responses are often ad hoc or ‘firefighting’.  The aim of this research is to produce robust responses to dynamic threats.

The research will build up as follows, from low to high complexity: static responses to static threats; static responses to dynamic threats; dynamic responses to static threats; and dynamic responses to dynamic threats.

We will use a variety of methods to define the best response: cellular automata, network analysis, spatial modelling, agent-based modelling, and the generation of dynamic fitness landscapes.

This PhD studentship is most suitable for candidates with a background in a quantitative discipline such as management science, operations research, engineering, physics and other natural sciences.

Business Analytics for Public Services and Regulated Industries: New Techniques for Analytics-Driven Decision Making in Traditional Industries (with Dr Maria Neiswand and Professor David Saal)

The rise of business analytics has created enormous opportunities within the private sector, but these benefits have yet to be fully realized in public services and regulated industries such as energy, water, and transportation networks. Meanwhile, governments are mandating the collection of data by installing smart metering devices. This gives rise to the need for innovative ways of thinking in industries that are still largely based on traditional economic thinking involving conventional assumptions on optimization and behaviour.

As an example, the energy sector is characterised by strongly defined market structures with incumbents and an ultimate need for energy network security, which not only prevents the quick adoption of technical changes but also translates into regulatory outcomes, such as price caps.

This exciting PhD opportunity will integrate theoretical and empirical approaches and spans two strengths of Loughborough’s School of Business and Economics: microeconomics and particularly rigorous analysis of the determinants of productivity and performance (including cost modelling) and management science (including simulation and network analysis).

We are therefore seeking a student with a quantitative background (whether in economics, management science, engineering, physics or other natural sciences). A willingness to learn new techniques such as cost modelling, performance measurement, agent-based modelling and network analysis is desired.

Accessing Academic Journal Articles when away from a University

Now is the time of year when students, and indeed academics, are away from their universities.  Having been used to academic journal articles at your fingertips, you can find yourself confronted by previously good-natured academic publishers persuading you to part with your hard-earned cash in order to access an article.

This is not a good thing.

So, from hardest to easiest, here are methods for accessing those articles.  The examples are for the University of Oxford, but other universities have similar setups.

University Library Journal Search

For example http://www.bodleian.ox.ac.uk/ptfl/eresources/ejournals

Various journals have various sign-in mechanisms, which is a bit of a pain, but you will probably be able to find what you are looking for here.

VPN

Many journals are accessible when you are connected within your university – access tends to be based on the IP address of your computer.  So, no access from outside.  To counter this, you can use a VPN – for example http://help.it.ox.ac.uk/network/vpn/index – you download VPN client software, which makes it seem as though your computer is in Oxford, and you should then have access.

Google Scholar

Another way is to search for the article title in Google Scholar.  See for example https://scholar.google.co.uk/scholar?hl=en&q=Agent-Based+Models+and+Behavioral+Operations+Research&btnG=&as_sdt=1%2C5&as_sdtp= . The PDF link on the right should give you access.  If that doesn’t work, you can sometimes find the right version by clicking on ‘All n versions’, as one of those may be the PDF.

Google

And if that doesn’t work, just Google the name of the article followed by filetype:pdf, for example https://www.google.co.uk/search?site=&source=hp&q=Agent-Based+Models+and+Behavioral+Operations+Research+filetype%3Apdf&oq=Agent-Based+Models+and+Behavioral+Operations+Research+filetype%3Apdf

Appeal to the Author’s Vanity

And if all that fails, write to the author(s) directly.  We are vain people, and there’s nothing better than receiving a request from a student, particularly one who is genuinely interested in the work.  Just type their name into Google, and their university webpage should show you their contact email address.

Agent-Based Models for Simulating Human Behavior: IFORS Conference 2017

This presentation, including joint work with Alberto Franco, was given at the IFORS (International Federation of Operational Research Societies) conference in Quebec City, QC, Canada.  We present two different agent-based models for simulating human behavior.

We use the example of group decision making.

The first model uses a cognitive fitness landscape to model the quality of a decision, where participants compare their decision with that of their nearest neighbor.  The decision is thus based on an external comparison.

The second model uses an internal comparison of a decision with the next best alternative.  The model is based on the psychological concept of hidden profiles, where participants can only make the best decision by sharing information with the group.

The Conservative Manifesto: Care Fees as a Percentage of Initial Wealth

The Conservative Party have announced their manifesto for the 2017 General Election.  Included in this (on pages 64–65) is the following proposal:

We will introduce a single capital floor, set at £100,000, more than four times the current means test threshold. This will ensure that, no matter how large the cost of care turns out to be, people will always retain at least £100,000 of their savings and assets, including value in the family home.

A quick calculation of the effective ‘Tax’ (Care Fees as a Percentage of Initial Wealth) shows the following distribution of Tax Rates.  On the x-axis is initial wealth (the value of your house plus any savings), and on the y-axis is the Tax Rate.  The Conservative Party have since augmented this plan with a proposed cap (consultation to come).

Values used for Care Fees: £20,000, £40,000, £60,000, £80,000, £100,000.  Values used for Initial Wealth: £0, £100,000, …, £1,000,000.  The trend continues downwards after this figure.
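
The underlying calculation is straightforward.  A minimal sketch, assuming care fees are simply capped so that £100,000 of assets always remains, with no overall cap on the fees themselves (the cap was only proposed later, as noted above):

```python
FLOOR = 100_000  # the proposed capital floor

def effective_tax_rate(initial_wealth, care_fees):
    """Care fees actually paid, as a fraction of initial wealth,
    assuming fees are capped so that FLOOR always remains."""
    if initial_wealth <= 0:
        return 0.0
    paid = min(care_fees, max(0, initial_wealth - FLOOR))
    return paid / initial_wealth

for wealth in range(100_000, 1_100_000, 100_000):
    rates = [effective_tax_rate(wealth, fees)
             for fees in (20_000, 40_000, 60_000, 80_000, 100_000)]
    print(f'{wealth:>9,}', ' '.join(f'{r:>4.0%}' for r in rates))
```

The output shows the spike just above the floor: someone with £200,000 facing £100,000 of care fees pays an effective 50%, while someone with £1,000,000 pays 10%.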

As many have pointed out, this affects individuals with initial wealth just over £100,000 proportionately far more than those with higher initial wealth.  More detail on policy options for funding social care can be found in the Dilnot Commission report.

What does the ℮ mark mean on packaging?

The ‘℮’ symbol, or ‘e-mark’, is a symbol you will see on packaging such as tins or packets in Europe.  Millions of us see it every day, but what does it actually mean?

The raison d’être for the e-mark comes from the problem of selling goods to the public.  We would all like to think that we are getting what we pay for, but does that mean we should always get what we pay for?

Well, if you use the ℮-mark, then no.  And yes if you don’t.  So packers use the ℮-mark.  By doing so, some of us are short-changed, but, on average, we shouldn’t be.

The e-mark was introduced in 1976 by the legislation known by the snappy title of ‘Council Directive 76/211/EEC of 20 January 1976 on the approximation of the laws of the Member States relating to the making-up by weight or by volume of certain prepackaged products’.

This sets out the nominal value of a product.  This means that, on average, we should not receive less than the value stated next to the e-mark.  But we would be really annoyed if we received, say, nothing, and someone else received twice the nominal amount.  So the concept of the tolerable negative error was introduced at the same time, to set out the minimum legal amount that each packet or tin or container should contain.  The idea is that only a few containers can weigh less than the declared value minus the tolerable negative error (but none can fall short by more than twice the tolerable negative error… that would be, well, intolerable).

In packets from 5 grams to 10 kilogrammes, the tolerable negative error varies from 9% (quite a lot) to 1.5% (not such a lot), the rationale being that it is easier to measure larger values with greater accuracy.
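
The checks themselves are easy to express in code.  A minimal sketch of the logic described above, with the tolerable negative error passed in as a parameter, since the full banded table lives in the Regulations (the 2.5% limit on short measures below is an assumption for illustration; the Regulations define the exact sampling plan):

```python
def check_batch(weights, nominal, tne, max_short_fraction=0.025):
    """Three checks on a batch of fill weights.

    tne is the tolerable negative error for this nominal quantity, taken
    from the banded table in the Regulations. max_short_fraction is an
    assumed illustrative limit, not the Regulations' exact sampling plan.
    """
    average_ok = sum(weights) / len(weights) >= nominal
    short = [w for w in weights if w < nominal - tne]           # short measures
    very_short = [w for w in weights if w < nominal - 2 * tne]  # intolerable
    return (average_ok,
            len(short) / len(weights) <= max_short_fraction,
            len(very_short) == 0)

# Example: nominal 500 g packets, with an assumed TNE of 15 g (3%)
weights = [502, 498, 495, 505, 500, 508, 501]
print(check_batch(weights, nominal=500, tne=15))  # (True, True, True)
```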

Excruciating detail can be found in The Weights and Measures (Packaged Goods) Regulations 2006.  It is interesting to note that the HTML version of the Regulations contains illegible formulae.

I leave it as an exercise for the lawyer to determine whether this would be a valid defence in criminal proceedings.

Eight Mile and the Emergence of Segregation

Eight Mile, epitomized by Eminem in the film of the same name, is a street in Detroit that marks the boundary between the majority white northern suburbs and the majority black neighborhoods closer to the inner city.

But what causes this segregation in the first place?

Hypothesis 1: The Central Planner

Zoning Map, 1930s, showing HOLC zoning, source: http://www.urbanoasis.org/projects/holc-fha/digital-holc-maps/

In Detroit’s case, as with many cities across the USA, it was, in part, due to the federal Home Owners’ Loan Corporation, which zoned the city into areas of risk, meaning that banks were indirectly encouraged to develop the outer suburbs while not offering mortgages on inner-city properties.  This led to wealthier, generally white, residents moving to the suburbs.

Indeed, physical barriers, such as the Detroit Wall, also known as the Eight Mile Wall, were built to separate majority black and majority white neighborhoods.

Detroit Today

The legacy of these zones lives on today, as seen in the map below from the 2010 US Census.  The dividing line between the green (black) areas and the blue (white) areas is Eight Mile Road.

DotMap http://demographics.virginia.edu/DotMap/ based on 2010 US Census

So, segregation exists, and is caused by a central actor. But is there an alternative explanation?

Alternative Hypothesis: Emergence

In 1971, Thomas Schelling set out to model the phenomenon, not by assuming a central planner, but by modelling the interactions of individuals.

Thomas Schelling’s model was this.  Assume individuals are placed on a grid, similar to being located on a chess board.  Allow individuals who are in a local minority to move.  In the example below, the blue circle is in a minority (with 5 out of its 6 neighbors being a different color) and, according to the rules of the model, is unhappy.  It could decide to move to the vacant square to the north-west, but it would still be in a local minority (with 4 out of 6 neighbors being a different color) and would remain unhappy.  So instead, it chooses the space to the south-west where 3 out of its 6 neighbors are of the same color, and, not being in a minority, it settles there.

Agent Movement © Duncan Robertson after Thomas Schelling (1971)

Schelling, perhaps without knowing it, introduced agent-based modelling: instead of modelling the system as a whole, modelling individual agents enables us to see the emergence of macro-level properties, in this case segregation, from micro-level (local) interactions.

We can see the effect of micro-level interactions causing macro-level segregation in the model below (developed by Duncan Robertson after Wilensky after Schelling). Each individual, or agent, decides whether they are unhappy or happy; if they are unhappy, they search until they find a vacant location where they will become happy.  This continues until all individuals attain happiness.

Three Class Segregation Model © Duncan Robertson after Wilensky after Schelling
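
For anyone who wants to experiment, here is a minimal sketch of a two-colour Schelling-style model on a square grid.  It is an illustration only, not the three-class model above: the grid size, density, and happiness threshold are arbitrary choices.

```python
import random

SIZE = 20          # the grid is SIZE x SIZE, like a large chess board
DENSITY = 0.8      # fraction of cells occupied by agents
THRESHOLD = 0.5    # happy if at least this fraction of neighbours share your colour

def make_grid():
    return [random.choice(['red', 'blue']) if random.random() < DENSITY else None
            for _ in range(SIZE * SIZE)]

def neighbours(i):
    """Indices of the up-to-eight cells surrounding cell i."""
    x, y = i % SIZE, i // SIZE
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            if (dx, dy) != (0, 0) and 0 <= x + dx < SIZE and 0 <= y + dy < SIZE:
                yield (y + dy) * SIZE + (x + dx)

def is_happy(cells, i, colour):
    occupied = [cells[j] for j in neighbours(i) if cells[j] is not None]
    if not occupied:
        return True  # agents with no neighbours are content
    return sum(c == colour for c in occupied) / len(occupied) >= THRESHOLD

def step(cells):
    """Each unhappy agent searches for a vacant cell where it would be happy."""
    moved = 0
    for i in [j for j, c in enumerate(cells) if c is not None]:
        colour = cells[i]
        if is_happy(cells, i, colour):
            continue
        vacancies = [j for j, c in enumerate(cells) if c is None]
        random.shuffle(vacancies)
        for j in vacancies:
            if is_happy(cells, j, colour):
                cells[i], cells[j] = None, colour
                moved += 1
                break
    return moved

cells = make_grid()
for t in range(1, 201):
    if step(cells) == 0:  # everyone is happy: segregation has emerged
        print(f'settled after {t} steps')
        break
```

Even a modest threshold such as 0.5 produces strongly clustered neighbourhoods: mild individual preferences generate stark collective patterns.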

So, perhaps segregation is not imposed, but is down to us.  Or maybe, in reality, it’s a little bit of both.

Please do get in touch if you would like to discuss building or working with agent-based models.

The Unintended Consequences of Unintended Consequences

In 2003, as part of a project to document coastal erosion in California, the following photo was posted on the californiacoastline.org website.  It is a picture of a beach, some cliffs, lots of trees, and a rather nice house complete with swimming pool, which turns out to be the home of one Barbra Streisand.

Copyright © 2002-2015 Kenneth & Gabrielle Adelman, California Coastal Records Project, www.californiacoastline.org

For those not in the know, Barbra Streisand is, in the words of her lawyer, a ‘renowned singer, actress, movie director, composer, and producer’.

Now, it turns out that Barbra Streisand values her privacy.  To be specific, she put a value of at least $10,000,000 on it.  When the claim was filed, six people had downloaded the image.  However, when Barbra Streisand issued a claim in the LA courts (in her own name) confirming that she lives in the nice house with the swimming pool, court reporters started twitching their notebooks.  And so it came to pass that, once the lawsuit was publicised, everyone wanted to know what Barbra Streisand’s house looked like.  Nearly half a million downloads in the first month.

The Streisand Effect, as it has been dubbed, is an example of unintended consequences.  By planning to do one thing (suppress an image), you achieve the exact opposite.

The addendum to the story is that Barbra Streisand had a resurgence in popularity, culminating in the song ‘Barbra Streisand’ by the popular beat combo Duck Sauce, downloaded nearly 100 million times.

Maybe this was, after all, a masterplan to take advantage of the unintended consequences of unintended consequences.

Or maybe not.