Wednesday, 14 February 2018

Publish, Perish and Be Damned!

Academic publication has its elephant traps.

Publication is the end product of our research and one of the core activities of scholars and scientists everywhere. As the editor of major journals, I have been on the receiving end of the publication process for 32 years. During that period it has been abundantly clear that academic publication is a much misunderstood process. It has changed dramatically since I started editing in 1985, but, despite the constant metamorphosis, it is as misunderstood now as it was then.

When we write up our research we are endeavouring to communicate it. The first stage is to communicate with ourselves as we put our thoughts down on paper - or, more likely, the electronic equivalent. In this internal conversation, have we said what we meant and are we happy with our own expression? The second and more important stage is to communicate with readers, starting with the editor of the journal and the referees that he or she selects. Do they understand what is written? The third and final stage is to communicate with the research community at large. One's reputation depends on this more than on most other activities and accomplishments.

Editors vary in their approach to a manuscript. Some are, in effect, referees, while others delegate much of the evaluation process to scholars or scientists whom they have contacted, who are intended to be independent, impartial judges of the 'publishability' of the work. Referees vary from the meticulous to the sloppy, from the appreciative to the scornful, and from the helpful to the obstructive. By and large, if there is a flaw in the paper, they will probably see it and take note.

In 1980 I sent an article to Environmental Management, a journal published by Springer in New York. It was one of my first writings and I was casting around for a suitable home. Environmental Management was the most attractive, professional-looking journal on the shelf (bear in mind that we did not have digital resources then). Springer published it with clockwork efficiency and meticulous attention to detail, as I was to find when, five years later, I began a 17-year stint as its Editor-in-Chief. I mention this little episode, from almost four decades ago, to illustrate the importance of appearance, professionalism and rigour. We all may feel that substance is more important than style, but in reality how things look has a very significant influence on how they are judged.

Last year, I received 1,061 manuscripts to edit. It is amazing how many of them were sloppily prepared. It was not altogether uncommon to find errors of English grammar or usage in the title of the work, the first thing that an editor or referee sees. It is even more common to find them in the abstract, along with that most elementary of mistakes: an abstract that is an introduction to the work rather than a precis of it. Most of the time, referees do their work reluctantly. It is another chore that we take on for love of the academic life and a sense of responsibility towards science, scholarship and the academic community. Rarely, we may actually want to read the manuscript and see what the author has to say. But referees do not want to review bad manuscripts. A poor quality title, a sloppy abstract, and the referee makes the decision not to bother. The editor has perhaps sent out a request for reviews under the premise that although the paper starts badly, there is probably a useful research message concealed in it somewhere. How few reviewers are willing to search for it! And yet the author needs to get the message, loud and clear, that the paper is not up to scratch.

Some authors vaunt their command of word-processing software by sending in a manuscript that is designed to look as if it is already published in the journal, even down to having the right masthead. Presumably they think that this will increase the paper's chances of being accepted. In reality it merely creates problems. It is very difficult to comment in detail on a double-column manuscript, and usually the smallness of the type font makes the paper difficult to read. Even if the paper is deemed acceptable, the copy-editor and typesetter would have to unpick the elaborate formatting, as they use a different form of software, thus adding to their workload.

Although it is strictly against the rules to submit a paper to more than one journal at once, authors routinely hawk their papers around from one serial title to another. Rejection by one provokes submission to another. This is apparent to the editor when the formatting used is clearly that of another journal - or another discipline. One could argue that on first review it does not matter very much, as nowadays almost all papers have to be revised before they are published, and this is an opportunity to put the formatting right. That is true up to a point, but when the format of the paper is widely divergent from that of the journal it does tend to imply that the author is not particularly committed to publishing in that particular venue. And why should the editor and reviewers be committed to giving the paper the green light?

The US Geological Survey has a strict policy that articles by their employees cannot be submitted for publication until they have been signed off by the USGS editorial office. As a result, the papers invariably demonstrate a level of professionalism in both the appearance and the content that others would do well to emulate. When you ready your article to go to a journal, double-space all of it, make sure there are page numbers and add line numbers (I prefer consecutive numbering of every fifth line - unobtrusive but effective). Failure to add the numbers is so common that I, as an editor, have a stock phrase ready: "Commenting in detail on this paper is hampered by the absence of page and line numbers." I also have stock phrases for the common errors of English, and I very often have to use them. Make sure that headings, sub-headings and referencing are consistent. It is a small matter but it makes a great deal of difference to the reader. Irritating a reviewer is not likely to get you a sympathetic review!

Obviously, the most important issue for submitting a manuscript is the quality of the science and scholarship that it embodies. The sections should be well-thought-out and should follow on in a logical stream. The arguments should be watertight. The literature should be competently reviewed. The paper should be well-focussed, without digressions, extraneous material or superfluous argument. For instance, when submitting to a journal based in a particular field, there is no need to write a general introduction to the field, as readers are bound to know the basics. Finally, the paper should have breadth of appeal. Most field or laboratory work is pretty small-scale, but the value lies in connecting it to a wider reality, for that is how science advances.

Most of what I have written in this short essay should be self-evident. However, it is constantly surprising how few academic authors follow these strictures. The process of transforming thoughts into readable prose and scientific argumentation that can be shared is evidently a very imprecise one. Yet I am convinced that a little more care and attention can mean the difference between an article that is sympathetically reviewed (and an author who is respected) and one that is summarily rejected.

Monday, 5 February 2018

London and Earthquakes


Londoners leave the city in advance of a predicted third earthquake in 1750 - an earthquake that never happened.

Obviously, London is not a city one normally associates with earthquakes and seismic damage, but there is such a connection, and it is quite surprising.

There are tales of the effects of earthquakes upon London and its inhabitants in at least a dozen cases before the 20th century. Other accounts may be hidden and it is quite probable that other earthquakes were felt but not written about in surviving literature. A few events have occurred with epicentres that were likely to have been under London, but the majority come from the seismogenic areas of the United Kingdom: the Midlands and Lincolnshire, north Wales, Kent and the English Channel into France and the Netherlands.

In this brief and unscientific account, intensities are given in the roughly coincident part of the main, relatively modern, scales (MM, MSK, etc.) and, because of the low values, magnitudes are quoted in the by now redundant and somewhat inaccurate Richter scale, ML, or local magnitude.

An earthquake is known to have rocked London in December 1164, but no details are forthcoming. Another occurred on 13th or 20th February 1247 with possible seiching of the Thames. Still others occurred on 14th December 1269, 11th September 1275 and 4th January 1299. A more substantial seismic event took place on Wednesday, 21st May 1382 at about 2 p.m. It caused enough seiching on the Thames to capsize boats, and it significantly damaged old St Paul's Cathedral. The earthquake occurred while a Synod of the Church was being held in Blackfriars in order to discuss dissenters. The Archbishop of Canterbury, William Courtenay, was prescient and level-headed enough to attribute it to natural causes, something that his successors in the church rarely did. Overall, the 1382 earthquake caused effects in London to about intensity VI. A further earthquake struck London on 23rd April 1449, and one occurred in Croydon on 25th May 1551.

One of the most widely discussed historical earthquakes was that which occurred on Wednesday, 6th April 1580, at 6 p.m. It had an estimated magnitude of 5.7-5.8, and an inferred hypocentral depth of 20-30 km. Its estimated recurrence interval was about 200 years. The epicentre was either in the English Channel or in France south of Calais. London fell at the western end of a band of intensity VII effects that extended across the Channel to Lille and beyond. It appears that this event caused seiching in the Thames, if not a minor tsunami, and it definitely resulted in localised flooding. Several chimneys collapsed in London, a pinnacle fell from Westminster Abbey, and damage was particularly significant in Shoreditch. Two young people, Thomas Gray and Mabel Everite, were killed by falling stones at Christchurch, Newgate. The boy died instantly and the girl succumbed several days later. The 1580 earthquake may have come from a source akin to that which caused a magnitude 4.3 event (with a hypocentral depth of 5.3 km) under Folkestone in 2007.

A more northerly source of seismicity lies in the North Sea, and it delivered tremors to London on 24th December 1601. This brings to mind the largest recorded earthquake in the British Isles, the 1931 Dogger Bank earthquake (magnitude 6.1), which was also felt in London.

Relatively large earthquakes are also generated in north Wales and the Irish Sea area. Those of 7th October 1690 (magnitude circa 5.2) and 9th November 1852 (magnitude estimated at 5.3) had epicentres at Caernarfon and were felt in London.

A smaller event occurred at noon on Thursday 6th February 1750, with a magnitude of about 2.6. It was followed a month later, on 8th March, by a magnitude 3.1 event, which was experienced at 5:30 a.m. The shaking for this was violent in London, and the epicentres for the two events were estimated to have been near Leadenhall Street and near Lambeth, respectively. A rumour was propagated that earthquakes would occur monthly, which led to a mass exodus from London (and gridlock on the roads) on 8th April 1750. Needless to say, there were no tremors. The Church of England attributed the 1750 earthquakes to God's displeasure at the publication of Memoirs of a Woman of Pleasure ("Fanny Hill", by John Cleland, 1748-9).

A peculiarly destructive earthquake occurred at 09:18 on Monday, 22nd April 1884, with an epicentre at Wivenhoe, Essex. It lasted 20 seconds, and had an estimated magnitude of 4.6 and a hypocentral depth of about 70 km. In the Colchester area this event destroyed one mediaeval church and severely damaged at least four others. It damaged 1,250 other buildings in the area. One building, which was in a precarious state, was said to have collapsed in east London.

Other, more recent earthquakes that were felt in London include the 22nd September 2002 event at Dudley, West Midlands (magnitude 4.7), the 2007 Folkestone event mentioned above, and the earthquake of Wednesday, 27th February 2008 at Market Rasen, Lincolnshire (magnitude 5.2).

The leading expert on British earthquakes and seismicity is Dr Roger Musson of the British Geological Survey. He has warned that London is overdue for a damaging seismic event. Such is our dependence on critical infrastructure that next time the effects are likely to be more serious and more complex than they were in 1580 and 1750.

Select Bibliography

Davison, C. 1924. A History of British Earthquakes. Cambridge University Press, Cambridge: 332-335.

Guardian 2010. London is overdue for a major earthquake, warns seismologist. The Guardian, 16 September 2010.

Musson, R.M.W. 2004. A critical history of British earthquakes. Annals of Geophysics 47(2/3): 597-609.

Musson, R.M.W. and P.W. Winter 1996. Seismic hazard maps for the U.K. Natural Hazards 14(2-3): 141-154.

Neilson, G., R.M.W. Musson and P.W. Burton 1984. The  “London” earthquake of 1580, April 6. Engineering Geology 20: 113-141.

Scott, R.F. 1977. The Essex earthquake of 1884. Earthquake Engineering and Structural Dynamics 5: 145-155.

Sunday, 15 October 2017

Why the Hazards Paradigm Remains Stronger Than the Vulnerability Approach

One of the great paradoxes of disaster studies is the dominance of the hazards paradigm over the vulnerability approach. In 1983, Kenneth Hewitt and his colleagues published Interpretations of Calamity (Hewitt 1983), which cogently set out the arguments for regarding hazard as the trigger of disaster and vulnerability as the essence of the phenomenon. More recent attention to the underlying risk drivers (Blaikie et al. 2003) and to disaster risk creation (FORIN Project 2011) has reinforced that view. But what do we see? Hazards-based approaches continue to dominate the field. Indeed, they continue to strengthen their dominance. There are ten reasons why this is so, as follows.

1. It is easier to blame disasters on a neutral agent, such as an extreme natural event, than on human decision making. That said, it is becoming less easy to do so as the full force of human-induced climate change becomes more and more apparent.

2. People, including scientists, tend to shy away from root causes, which can be complex, agonising and therefore intimidating. Vulnerability as a root cause is often a particularly difficult phenomenon to get to grips with as it tends to be multi-faceted, complex and insidious.

3. Political decision making is a major root cause of vulnerability to disaster. It is all too often divorced from rational advice and wedded to ideology. In the face of political forms of 'rationality', it is hardly surprising that it seems more attractive to study natural phenomena than the vagaries of human behaviour.

4. For many decades there have been massive investments in 'hard' science and no corresponding levels of support for endeavours to understand vulnerability.

5. There is a widespread and enduring belief in the 'technofix' approach to disasters. The bigger the problem, the more technology is needed to fix it. This is, of course, an ideological position in its own right. As it seldom succeeds, but remains wildly popular (especially among those who make a living out of selling technology), the result is that worsening conditions engender yet more dependence on technological solutions, and vulnerability continues to rise.

6. In many parts of the world, libertarianism dominates over regulation. Yet the conditions that produce vulnerability need to be regulated if it is to be brought under control.

7. The position of the social sciences is subordinate to that of the physical sciences in the world's academic systems. There is still considerable prejudice in scientific quarters against the 'softness' of the social sciences, which are regarded as lacking in rigour because they often do not produce concrete or precise results.

8. There is a particular view of magnitude and frequency that acts as a framework for responding to disaster. I refer to the physical magnitude and frequency of events, not the magnitude of vulnerability.

9. Physical development (such as urban development and the building trade) is a juggernaut that often crushes dissent and restraint. It has enormous political support and it creates vulnerability by putting more and more assets in harm's way.

10. Finally, vulnerability is a paradoxical phenomenon. Like friction, it only really exists when it is mobilised (by impact) and therefore it must be studied either hypothetically before it manifests itself or post hoc after it has been converted into damage. It is thus much less tangible than the physical forces of hazards that can be measured in the field.

Taken together, these ten observations go a long way to explaining why the disaster problem is such a long way from being solved and, indeed, why it continually gets worse. Of course, there is no guarantee that a better understanding of vulnerability would lead to better management of it, but it is nevertheless clear that more and more knowledge of physical hazards does less and less for the process of reducing disaster.

References

Blaikie, P., T. Cannon, I. Davis and B. Wisner 2003. At Risk: Natural Hazards, People's Vulnerability and Disasters (2nd edition). Routledge, London.

FORIN Project 2011. Forensic Investigations of Disasters. Integrated Research on Disaster Risk, Beijing, 29 pp.

Hewitt, K. (ed.) 1983. Interpretations of Calamity from the Viewpoint of Human Ecology. Unwin-Hyman, London: 304 pp.

Sunday, 1 October 2017

On Integrity


The spectacle of President Donald Trump endeavouring to belittle the mayor of San Juan over aid to Puerto Rico after the devastation wrought by Hurricane Maria prompts me to a rather personal reflection about the breadth of people's attitudes. The argument over aid is a squalid one and it betokens a squalid outlook on the part of the dominant opponent.

Many years ago I formed a close friendship with a man who was 30 years older than myself, whom I shall refer to by his title and first name, Don Rocco. He was a retired medical doctor, of considerable stature in his profession. During his career he founded a clinic for the treatment of tuberculosis and established a hospital in an area that at the time lacked the most basic medical amenities. Don Rocco was a modest man in everything except his concern for the safety and well-being of his people. I came to know him because he lived in a region that suffered badly from natural hazards and he was keen to encourage researchers to come and study there, and to provide some answers to the problem of disasters.

Don Rocco was a man of remarkable integrity. Others enriched themselves and gained status out of their work with the poor and needy, or their efforts against hazards: he did not. He would always listen to people's concerns and, wherever he could, he would try to help. Not all those around him were as admirable. He and I got on well and we would take daily walks and tell each other our secrets. On one occasion, I met him coming out of the hospital he had founded decades earlier. His expression was grim and I asked him what was up. He replied, "I feel like a father who has just learned that his daughter is a prostitute." I did not ask him what he had learned that day in the hospital but I did what I could to revive his spirits. As others succumbed to base instincts, his stature simply grew. People from places near and far admired and respected him. The more squalid the behaviour of others became, the more Don Rocco was admired. He won a presidential gold medal, but in his study the only item he showed off was a facsimile of the Magna Carta, which was for better or worse the symbol of his faith in democracy.

Don Rocco lived on into his nineties and was finally buried in the small cemetery of his home town, on the hill, at the bend in the road, overlooking the valley where once, a thousand years ago, the Saracens passed by on their way towards conquest. When he died, the hospital and the clinic were named after him. Outside the latter, there is a fairly lifelike statue of him, the man of faith and integrity, the man who always set an example but without showing the slightest pretence or ostentation. Don Rocco will live on in my heart until I too cease to exist. In the meantime, I must confess that it is very difficult to come to terms with the fact that there is now a public monument to my close friend. Such is the human condition.

Thursday, 31 August 2017

Climate Change and Cascading Disasters

Flooding in central Bangladesh. (photo: DA)

Once again, disasters are topical. As usual, why they are topical rather depends on what else is featuring in the news at the same time. Floods in the southern USA and South Asia throw into sharp relief the possibility that climate change may already be causing extreme events to be larger and more destructive. Perhaps in the images of destruction and inundation we have a graphic illustration of an outcome that needs to be shown to people for them to believe it. Experts prognosticating in front of television cameras are not enough to convince the sceptics about climate change (let alone the hard-line deniers): what is needed is a good, solid floodwave.

But let me introduce a new element: cascading disasters. In essence, a primary impact such as rising floodwaters leads to a series of knock-on effects. But it does not stop there. The interaction of different sources of vulnerability means that effects can be transformed into new causes.

In 2002 flooding on the Vltava (Moldau) River severely inundated the city of Prague, but also impacted the Spolana chemical factory, causing an explosion and a toxic cloud. As I write, something similar is expected at the Arkema factory in Crosby, Texas, as a consequence of flooding caused by Tropical Storm Harvey. Primary and back-up systems for cooling volatile chemicals have failed. Explosive or combustive reactions are expected. What will be their consequences? Time will tell.

On the other side of the world in the Indian sub-continent, commuters are being prevented from getting to work and children are being deprived of schooling by flooding that is greater in magnitude and impact than its American counterpart. A building has collapsed in Mumbai, killing and trapping its occupants, leading to a relief effort that must be added to that mounted against the effects of the floods and intense rainfall.

It may be that all future disasters above a certain size will be cascading events to a greater or lesser extent. This is because both the degree of mutual dependency and the growing complexity of society make such an outcome inevitable.

So what can we do about cascading disasters? First, we must recognise that the game has changed. The idea of disaster as simple cause-and-effect must be abandoned. Planning based on this assumption is likely to lead to the wrong remedies, or at least to inefficiency, with respect to both disaster risk reduction and disaster response.

Secondly, in developing strategies, tactics, plans and procedures, we must place the emphasis squarely on understanding vulnerability in all its forms. Commonly it is broken down into categories: physical, environmental, social, psychological, institutional, and so on. However, it also includes elements such as the risks of dependency upon technology, corruption, failure to innovate, and social polarisation. This means that vulnerability is best viewed as a complex, multi-faceted phenomenon. We must understand its mechanisms and the interactions between the facets. As has been written many times, disaster is socially constructed. It is the result of decisions made by individuals or groups, for it is they who put people and their possessions in harm's way. The study of cascading disasters involves the search for escalation points, at which vulnerability becomes compound and creates new "disasters within disasters". Remember that the Japanese M9 earthquake of 11 March 2011 was not the real cause of the Tōhoku disaster: that was the resulting tsunami and its effect on the Fukushima Dai'ichi nuclear plant. This is now one of the largest examples of a cascading disaster.

Thirdly, we must investigate the 'disaster pathways', which are the directions in which impacts propagate, including the 'escalation points'. This will give us the basis for anticipating the effects of a primary agent of disaster and either reducing them a priori or intervening to limit the damage.

In the twentieth century the concept of 'disaster' was viewed very much as one based on static relationships. From the 1950s, empirical studies of equilibrium were fashionable, and if a system failed to achieve equilibrium, 'homeostasis' could be invoked: in other words, the system was assumed to have a tendency to return from perturbations to its equilibrium, and thus to have a 'central tendency'.

The agenda has changed, and so should the outlook upon disasters. Physically, we have climate change; socially we have population growth and socio-economic polarisation of wealth and opportunity. We also have rapid changes in the global outlook coupled with increasing international interdependency. Seldom has vulnerability looked less stable.

The current floods in the USA and South Asia reveal the gaps and weaknesses in planning, with respect to both risk reduction and disaster response. Rather than cutting budgets and turning away from disaster risk reduction, decision makers need to devote far more resources to the problem--and, of course, to take cascading into account. This will require a shift from a 'technofix' approach that stems from hazard reduction to one based on vulnerability reduction. Many of us in the disaster studies community have been saying this for at least three and a half decades, vox clamantis in deserto. It is now, more than ever, economically and politically advantageous to listen to us.

Saturday, 5 August 2017

In Europe we're all going to die in disasters - or are we?


The top news on the BBC website this morning was that "deaths in Europe from extreme weather events could increase 50-fold by 2100". In my opinion, there are two lessons to be drawn from this.

The first is that the authors of the study (Forzieri et al. 2017) were very clever to release it at the time of maximum impact. As I write, the temperature outside my room is in the 40s Centigrade. The article was embargoed until 11.30 last night and pre-distributed to the mass media. Small wonder that today it got maximum exposure.

The second is that the research is pretty much worthless. It is misleading and highly unlikely to offer an accurate forecast. It is a hazards-driven study that effectively uses exposure as a surrogate for vulnerability, about which the authors have remarkably little to say (see my comments in Davis 2017). And yet it has been demonstrated all over the world that vulnerability defines death tolls - i.e., people can live in highly hazardous zones and not die if they are not vulnerable (Wisner 1993). Various African countries, India and Bangladesh have all had some notable successes in reducing disaster mortality in areas of high population growth (e.g. Paul et al. 2010). Moreover, one of the effects of the International Decade for Natural Disaster Reduction was to hold the line on death tolls (it would have been nicer if they had gone down, but anyway, it was an achievement of sorts).

By way of illustration, the current heat wave is probably going to be comparable to that of 2003, during which it is estimated that there were 70,000 excess and premature deaths (Lagadec 2004). The figure is highly contentious, but, leaving that aside, since then measures have been put in place to avoid a repetition (Boyson et al. 2014, Pascal et al. 2012). These are mainly early warning systems to detect and assist vulnerable people. In Tuscany, where I am writing this, they have been highly effective, and I believe they have been in France and Spain, too. In the United States, as population rose, heat-related mortality declined (Sheridan et al. 2009). In contrast, Forzieri et al. (2017, p. e206) forecast that heatwave deaths in southern Europe will go up by 7,000 per cent in a century. If that were so, perhaps our work in disaster risk reduction would be a waste of time.

People put faith in figures because they seem precise and scientific, even when the reasoning that supports the figures is a hollow shell. The good side of the article is that it draws attention to the problem - or to part of it (and what a pity it does not draw enough attention to the extreme dynamism of vulnerability!). The bad side is that policy may end up being based on projections that are largely fantasy. There may indeed be massive increases in mortality in weather disasters in Europe, but that would be a function of many other factors - whether there is conflict, the impact of cascades, the functionality of antibiotics, emerging threats and hazards, dependency on critical infrastructure, the status of emergency preparedness, exotic diseases, the wealth differential, etc...

References

Boyson, C., S. Taylor and L. Page 2014. The National Heatwave Plan: a brief evaluation of issues for frontline health staff. PLoS Currents Disasters 13 January 2014.

Davis, N. 2017. Extreme weather deaths in Europe 'could increase 50-fold by next century'. The Guardian 5 August 2017.
https://www.theguardian.com/science/2017/aug/04/extreme-weather-deaths-in-europe-could-increase-50-fold-by-next-century

Forzieri, G., A. Cescatti, F. Batista e Silva and L. Feyen 2017. Increasing risk over time of weather-related hazards to the European population: a data-driven prognostic study. Lancet Planetary Health.
http://www.thelancet.com/journals/lanplh/article/PIIS2542-5196(17)30082-7/fulltext

Lagadec, P. 2004. Understanding the French 2003 heat wave experience: beyond the heat, a multi-layered challenge. Journal of Contingencies and Crisis Management 12(4): 160-169.

Pascal, M., K. Laaidi, V. Wagner, A.B. Ung, S. Smaili, A. Fouillet, C. Caserio-Schönemann and P. Beaudeau 2012. How to use near real-time health indicators to support decision-making during a heatwave: the example of the French heatwave warning system. PLoS Currents Disasters 16 July 2012.

Paul, B.K., H. Rashid, M.S. Islam and L.M. Hunt 2010. Cyclone evacuation in Bangladesh: tropical cyclones Gorky (1991) vs. Sidr (2007). Environmental Hazards 9(1): 89-101.

Sheridan, S.C., A.J. Kalkstein and L.S. Kalkstein 2009. Trends in heat-related mortality in the United States, 1975-2004. Natural Hazards 50(1): 145-160.

Wisner, B. 1993. Disaster vulnerability: scale, power and daily life. GeoJournal 30(2): 127-140.

Tuesday, 1 August 2017

Seven Rules for the Application of Operations Research to Disaster Management


It is currently very fashionable to apply the methodologies of operations research to disaster mitigation, management and response. Is this a fashion or a fad? Will the algorithms be used and appreciated, or are they merely wasted effort? Do the algorithm makers understand what conditions are like in a disaster, and what the real needs of managers and responders are?

In disaster management there is a well-founded hostility towards over-sophisticated routines and equipment. Managing emergencies will always be a rough-and-ready process, in which most of what is done is a kind of approximation. Such is the nature of uncertainty and rapid change in the field that it could never be otherwise.

If operations research is to make a useful contribution to disaster management, it will have to take account of these principles:-

1.    In emergencies, 'optimisation' is a very relative term. Pre-planned activities require considerable scenario modelling in order to take account of the real needs that will be generated during a future emergency.

2.    Optimisation based on an assessment of pre-disaster conditions is unlikely to be relevant to the post-disaster situation. Infrastructure will be damaged, inefficient and probably partly non-functional, as the sketch after this list illustrates.

3.    Optimisation that assumes perfect knowledge of the situation is bound to fail. During major emergencies, the common operating picture is constructed slowly and with difficulty. One cannot optimise a situation that is partially unknown.

4.    Algorithms that are designed to be used in emergency situations should be capable of deployment during emergencies. This means that at the height of a crisis time cannot be expended on collecting data or running lengthy analyses.

5.    To make an algorithm credible, evidence should be provided that it is acceptable to field commanders, who would use it or act upon the results that it provides. Optimisation is not an objective held by most emergency managers and field commanders. An algorithm that does not take account of their needs and ways of thinking is highly unlikely to be appreciated or utilised by them.

6.    Decision support systems are welcomed if they really do support decision making. No sensible emergency manager would put blind faith in an algorithm unless the results clearly demonstrate that it works and visibly improves the situation.

7.    Flexibility is an essential ingredient of any algorithm. In disasters, conditions on the ground can change abruptly and without warning. Algorithm makers need to understand the difference between 'agent-generated demands' and 'response-generated demands', as described in the classical literature on the sociology of disasters.
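
To make rules 2 and 3 concrete, here is a minimal sketch of how a route that is 'optimal' on the pre-disaster road network can become irrelevant once links are lost or degraded. The network, travel times and damage below are invented purely for illustration, and the routine is simply a textbook shortest-path calculation, not a claim about any particular decision-support tool.

```python
# Toy illustration of rules 2 and 3: the 'optimal' pre-disaster route may
# not survive the disaster. All names, travel times and damage are invented.
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm; returns (cost, path), or (inf, []) if unreachable."""
    queue = [(0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, travel_time in graph.get(node, {}).items():
            if neighbour not in visited:
                heapq.heappush(queue, (cost + travel_time, neighbour, path + [neighbour]))
    return float("inf"), []

# Pre-disaster network: travel times (hours) from a depot to a shelter.
pre_disaster = {
    "depot": {"bridge": 2, "ring_road": 5},
    "bridge": {"shelter": 3},
    "ring_road": {"shelter": 6},
}

# Post-disaster picture: the bridge is down and the ring road is congested.
post_disaster = {
    "depot": {"ring_road": 9},
    "ring_road": {"shelter": 6},
}

print(shortest_path(pre_disaster, "depot", "shelter"))   # (5, ['depot', 'bridge', 'shelter'])
print(shortest_path(post_disaster, "depot", "shelter"))  # (15, ['depot', 'ring_road', 'shelter'])
```

The pre-planned route via the bridge simply ceases to exist. The only defensible course is to re-run the calculation on whatever picture of the network is available at the time, however incomplete, and to treat the result as provisional.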

Tuesday, 18 July 2017

The 'Should Ratio'


The word 'should' is damnable. We all should. There are probably ten or twelve things I should be doing now instead of writing this, and umpteen that I should have done but have not finished. But 'should' is a more serious business when it is applied to official documents. The 'should ratio' is the number of times the word 'should' appears per page of text. For UN Habitat's New Urban Agenda it is 0.33; for the Sendai Framework for Disaster Risk Reduction it is 0.37; for the UN's Sustainable Development Goals it is 0.4. But for the Oslo Guidelines on the Use of Foreign Military and Civil Defence Assets in Disaster Relief it is a whopping 1.93. Perhaps we can excuse the Oslo document because it is a set of guidelines, not a formal treaty.

The point about the 'should ratio' is that 'should' is a weaker word than 'will' or 'shall' or 'must'. We live in an increasingly fluid world in which, paradoxically, as the imperative to act increases, the will to do so declines, and along with it the sense of global responsibility. On the one hand, countries and their governments cannot be compelled to act, and some even resent being told that they should act: witness the response of the Trump administration at the G-20 meeting to the climate treaty. On the other hand, I firmly believe that 'should' is a word to avoid. In academic papers, when the discussion and conclusion sections start "shoulding", the reader knows that the authors are delivering prescriptions that no one will heed.

I urge you, gentle reader, to make use of the 'should ratio'. It is very easy to compute. In a PDF document, a search function will tell you how many 'shoulds' appear and a calculator will tell you how often this is per page. Please go on to name and shame the writers who overuse the word.
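
For anyone who wants to automate the count, here is a minimal sketch in Python. It assumes you have already extracted the document to plain text (for example with a command-line tool such as pdftotext) and that you know the page count of the original; the file name and page count in the usage line are illustrative, not real measurements.

```python
# Minimal 'should ratio' counter: occurrences of the word 'should' per page.
# Assumes a plain-text export of the document; the page count must be
# supplied by hand because plain text loses the original pagination.
import re
import sys

def should_ratio(path: str, pages: int) -> float:
    """Return the number of occurrences of 'should' per page."""
    with open(path, encoding="utf-8") as f:
        text = f.read()
    # Word boundaries stop us counting words such as 'shoulder'.
    count = len(re.findall(r"\bshould\b", text, flags=re.IGNORECASE))
    return count / pages

if __name__ == "__main__":
    # Example usage: python should_ratio.py document.txt 40
    document, page_count = sys.argv[1], int(sys.argv[2])
    print(f"'should' ratio: {should_ratio(document, page_count):.2f} per page")
```

The word-boundary match deliberately counts every 'should', including those inside quotations, which is exactly what a manual search in a PDF reader would do.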

Perhaps in the future a piece of research will tell us what is an acceptable should ratio, if there is such a thing. In the meantime, this blog resolves that the word 'should' should be replaced by the word 'must' in all worthwhile initiatives to reduce disaster, curb pollution, stop poverty, diminish vulnerability, increase safety and security, etc. Should be replaced...

Monday, 10 July 2017

The Tinderbox Tower (2): Disaster in the Sky




In 2001, I was appointed Scientific Director of the training school in civil protection run by the Regional Council of Lombardy. I was based in Milan, Italy. One of my first tasks was to train 28 senior emergency managers. As some of them worked in the 32-storey Pirelli Building, and as it was not long after the World Trade Center disaster in New York, I set them the task of devising an evacuation plan for the Milanese skyscraper. Shortly after I had given them my analysis of the events in New York during the terrorist attacks known as "9/11", a light aircraft was deliberately flown into the Pirelli Building, killing two occupants and the pilot and starting a fire. I was relieved to find that the evacuation plan worked: it was based on careful investigation, clear thinking and rigorous drills for the users of the building.

Back in the UK, a couple of years ago I attended a trade fair for emergency service equipment. The organisers presented me with a copy of the 1974 disaster movie The Towering Inferno. I thought at the time that this was a rather crass gesture, but as I was making a study of disasters and popular culture, I sat down and watched the film. It is a rather silly movie: heroes, villains, bungled escapes, all the usual ingredients of the genre. At the end, there is a conversation between a fire chief and the architect who designed the building that burnt down. The latter, who evidently uses his head merely as a place to park his architect's hat, has never thought about the flammability of the building and the fire chief has to convince him that now, after the disaster, is the time to give such matters his attention.

Here we are, 115 years after New York's Flatiron Building was erected as one of the very first skyscrapers, with more than a century of accumulated knowledge about how to improve the performance of tall buildings under duress. And yet in London, 43 years on, it is bizarre to see The Towering Inferno become a sort of self-fulfilling prophecy. The Grenfell Tower, or "tinderbox tower" as one resident called it, was at one point a sheet of flame that stretched from the ground to the 24th floor. How is that possible, given all that is known about the performance of tall buildings and about inhibiting the spread of fire? And what about the electrical surges of 2013 and the naked gas pipes that so frightened the residents? All too often there is a yawning gap between what we know about safety and what we do about it.

In the 1980s and 1990s the British sociologist Anthony Giddens and his German counterpart Ulrich Beck defined the "risk society", a concept that became very popular among students of modernity. Risk is a function of our preoccupation with safety, and the risk society is our way of fighting against the threats and uncertainties caused by the increasing pace of modernisation, or so they argued. For me, this view of modernity is too technocratic. I believe we live in a vulnerability society. It is not the desire to quantify risk and put it in conceptual pigeon-holes that defines modernity, but the political decision making that condemns groups and whole classes of people to be especially vulnerable to disasters and other adversities. We share the risks unequally. One thing that is particularly striking about the aftermath of Grenfell Tower is that there is little sign of redress. So far, those who were marginal and at risk before the disaster continue to be marginal after it.

Oh, and I forgot: in Britain we don't have disasters, we have 'major incidents', as the official language terms them. But, viewed without the cultural belittling, it was indeed a disaster, and one that reverberates around the nation by revealing a whole landscape of forgotten, ignored vulnerability, so much of which is morally and ethically unacceptable.

As a precaution, Transport for London closed the underground lines that run through Latimer Road station, which is almost in the shadow of the blackened wreck of Grenfell Tower. One suspects that the transport planners had in mind Exercise Unified Response, the £800,000, six-nation disaster drill which was conducted in London in February and March 2016. In this, emergency responders dealt with a simulated emergency situation in which a tall building had collapsed onto a tube train. The wreckage and the rubble were packed into a redundant power station in Dartford. The trains were moved out of harm's way, but how many of the residents who live even closer to the tower were evacuated? What did the housing managers have in their minds when they let the residents stay? The consequences of radically different attitudes to risk could be encountered in but a few hundred square metres.

Walking to the site shortly after the fire from the nearest tube station, Holland Park, one passed through a landscape of Porsches and Bentleys, gracious Georgian houses and well-swept streets. Suddenly, around the corner there was a gloomy football pitch squashed under the Westway, rusty fencing, concrete gardens, barracks-like social housing. And then came all the signs of agony, anger, distrust. In some respects, it reminded me of Miraflores in Lima, Peru, where the houses of the rich are carefully defended against the tide of 'informal' housing of those who are not fortunate enough to live in comfort and luxury.

Emergency response systems are usually mosaics. Like the curate's egg, they tend to be good in parts. There are social, political and economic explanations of why this is so, but one of the main reasons is that they depend on actual people: a good leader, a good organiser, committed supporters, all of them can transform a failing system into a functional one. In October 1999, the Royal Borough of Kensington and Chelsea had the Ladbroke Grove train crash in its back garden. This 'major incident' (31 dead, 520 injured) was managed competently. On the other hand, it was not exactly an indigenous matter, as the survivors quickly moved on. Eighteen years later, it is breathtakingly awful to see the richest borough in the land make so many classic mistakes as it struggles to respond to disaster on its doorstep. London's emergency arrangements are not bad; indeed, they are excellent in some crucial respects. But a failing of the British system is the relatively loose connection that exists between planning for an emergency, on the one hand, and managing and responding to it, on the other. So much can "fall between the cracks".

My colleagues and I have developed a fascination with the 'transitional phase' of disaster aftermaths, the period that is supposed to connect the phase of intense activity in the initial response with the more measured phase of reconstruction, in which long-term solutions are devised to the problems caused by disaster. We have been studying this in a variety of settings: the Philippines after Typhoon Haiyan (2013), Mexico after landslides and floods, Japan after the tsunami and nuclear release of 2011. We find that in some cases - the worst ones - it isn't a transition at all, or rather it is a transition from nothing much to nothing else. Absence of clear strategies breeds lack of trust in authority, loss of confidence and a fear of the future that, sadly, is often well-founded. Is this what we see in north Kensington, a transition from precarious marginalisation to more of the same?

In the great practical field of endeavour that is now known as 'disaster risk reduction', there is a sort of reverence for the idea of 'community', as if it were a universal palliative to a whole catalogue of ills. In reality, communities can be therapeutic, but they can just as easily be vehicles for division, dissent, distrust and disassociation. Its borough may be a divisive place, but North Kensington in adversity is a good kind of community: I suspect it is very much the best of its kind in Kensington and Chelsea, if we also give recognition to the Chelsea pensioners. At this point, "working with the community" comes into play. Everything we know about the local scale in disasters suggests that imposing solutions onto people is a bad idea. Doing things for people is by no means as helpful as doing things with them, and for many issues it is better to support them while they solve the problems themselves. The ideal situation is one in which those whose decisions and actions caused the Grenfell Tower fire 'own' the problem, while those who survived it 'own' the solution. And may that solution be a catalyst for rectifying the entire national 'landscape of negligence' that is being slowly and steadily revealed as the weakness of fire regulations becomes apparent. If the community is to triumph, those who provide the assistance, run the enquiries, put things right, and speak for the community need to remember the motto of E.M. Forster: 'only connect'.

Sunday, 18 June 2017

The Tinder-Box Tower: Fire and the Neo-Liberal Model of Disasters


During the night of Wednesday 14th June 2017 a fire developed in Grenfell Tower, a 24-storey residential block in North Kensington, London. The building was quickly engulfed by the flames and within 24 hours it was a burnt-out wreck. At the time of writing, the death toll has not been established, but it is probably between 70 and 90 people. Many of them were trapped on upper floors by the fast-rising flames. The emergency response was massive and extremely rapid. This was only the third 40-engine fire response since the 1960s. It was the first time for many years that more than 100 firemen were committed to a highly dangerous environment. Despite their professionalism and heroism, at that point no fire-fighting operation could have stopped tragedy from unfolding.

It is a principle of the construction of tall buildings that, in the event of damage to one floor, progressive collapse should not happen. This principle was consolidated after a gas explosion in 1968 at Ronan Point, a 22-storey residential block in east London, led to the domino-like collapse of an entire corner of the building. In this case, the lesson was taken on board and governed practice. Another well-known principle is that fire should not be able to leap from floor to floor and thus climb the building. The speed and ferocity with which it did so at Grenfell Tower were quite extraordinary. Here, the lesson of past events was not transformed into safer practice. The building lacked a sprinkler system (whose installation would have cost less than 2.5 per cent of the cost of the renovations that took place over the period 2014-2016). Instead, for insulation purposes, the building was clad with panels that were not fire-resistant, and the question now arises as to whether the building codes were observed, or whether they were at fault.

A further element of the Grenfell Tower disaster is that it exposed the gap in living conditions between wealthy and poor residents. The location of the tower, the Royal Borough of Kensington and Chelsea, is the wealthiest residential district in the United Kingdom, with the highest property prices. The northern part of the borough presents an entirely different picture. In fact, it is one of the most deprived enclaves of London and the United Kingdom. It is evident that the resources of the borough have not been channelled into making living conditions safe for the residents. The contrast between extreme wealth and relative poverty in the same local district is a salutary reminder of socio-economic conditions in 2017 and an illustration of the consequences of almost half a century of the divergence of living standards between rich and poor.

In the days that followed the fire there was much debate about the scope and quality of the response, about the division and assumption of responsibilities and about the fact that the Grenfell Residents' Association had spent four years warning the building's owner and operator about fire risks.

The neo-liberal model of disasters suggests that they are often used as a means of consolidating power and exploiting the poor and needy (Klein 2008, Loewenstein 2015). Here is a case which illustrates neoliberalism in its other guise, in which the leaders and arbiters of society care very little or not at all about the conditions of risk under which the poor live. The Royal Borough of Kensington and Chelsea is the richest local authority in Britain: why could it not support its own residents, and why did it condemn them to live in patently dangerous conditions? The answers to these questions lie in political priorities and how they are formulated, marketed and supported.

References


Klein, N. 2008. The Shock Doctrine: The Rise of Disaster Capitalism. Penguin, Harmondsworth, 576 pp.

Loewenstein, A. 2015. Disaster Capitalism: Making a Killing Out of Catastrophe. Verso Books, London, 376 pp.