Sunday 25 February 2018

Black-Sky Thinking



 Network of Stoppages - Marcel Duchamp, 1914


"For a moment they saw the nations of the dead, and, before they joined them, scraps of the untainted sky." - E.M. Forster, 1909



In the current furore about artificial intelligence (AI), there is a growing fear that machines will take on a life of their own and behave in a malign and uncontrollable manner (e.g. Observer 2016). There are many worrying aspects of AI, and some heartening ones, but there is a remarkably simple answer to the talk of machines taking over: why not pull out the plug? The insurgency of self-controlled machines is a staple of science fiction, and much of the respect for that genre stems from the fact that it often contains a metaphor for humanity's current mores and preoccupations. In this case, it is an indication of the extent to which we all take electricity for granted.

More than ever before in history, electricity is our life-blood, and with every day that passes this becomes more true. If anything threatens our survival, it is the absence of electrical current in the distribution system of high- and low-tension cables and wires. Indeed, when NATO bombed Serbia in 1999, tacticians put a great deal of effort into attacking power stations with graphite bombs (which short-circuit the equipment) in order to render them inoperable and thus undermine the functioning of the enemy state.

Power failure is taken very seriously by utility providers and hospitals, but not by the general public or many businesses. For many decades we have been habituated to the idea that, if the power ever goes off, it will come back on very soon, and that interruptions of service will be rare. They are a nuisance that forces us to suspend our activities, but that is all. This is a testimony to the dogged work of electricity providers in ensuring supplies. It sets such countries apart from those where, through lack of energy resources, inability to maintain networks, shortage of investment and growing demand, electricity distribution is less stable. It also sets us apart from the forgotten corners of the world where power grids and distribution networks have not yet arrived, places treated either as romantic anachronisms or as marginal and of little consequence. But what if electricity distribution did fail on a significant scale? Both the causes and the consequences are likely to be quite involved (Luke 2010).

Much work has been carried out to protect electricity generation and distribution networks against progressive failures of the 'toppling dominoes' kind, characterised by a chain of protective isolations and shut-downs of the system (Terzija et al. 2011, Nateghi et al. 2016). However, with rising demand for electricity and diversifying supply, power distribution has become progressively more sophisticated, pervasive and internationalised. This has also created many more areas of potential vulnerability (Elizondo et al. 2002). Hence, 'cascading failure' is a term that is now less applicable to the physical relationships in a power network and more to the relationship between overall failure and chains of consequences (Chang et al. 2007, Bompard et al. 2009, Chiaradonna et al. 2011).
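The 'toppling dominoes' mechanism can be made concrete with a deliberately simple sketch. The Python fragment below is a toy overload-redistribution model, not a representation of any real grid: the line loads, capacities and equal-sharing rule are invented purely to show how one protective trip can propagate into a chain of shut-downs.

```python
def cascade(loads, capacities, initial_failure):
    """Trip one line, share its load among the survivors, and repeat
    until no further line exceeds its capacity (toy model only)."""
    failed = {initial_failure}
    while True:
        survivors = [i for i in range(len(loads)) if i not in failed]
        if not survivors:
            return failed                      # total blackout
        shed = sum(loads[i] for i in failed)   # load orphaned by failed lines
        new_loads = {i: loads[i] + shed / len(survivors) for i in survivors}
        # Protective devices isolate any line now driven over its capacity
        tripped = {i for i, load in new_loads.items() if load > capacities[i]}
        if not tripped:
            return failed                      # the cascade has stopped
        failed |= tripped

# Four lines with invented loads; every capacity is 100 units.
loads = [60, 70, 80, 90]
capacities = [100, 100, 100, 100]
print(sorted(cascade(loads, capacities, initial_failure=3)))  # -> [0, 1, 2, 3]
```

In this contrived example, losing the most heavily loaded line pushes a second line over its limit, and the redistribution of their combined load then takes out the rest: a chain of protective isolations of exactly the kind the literature describes.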

Despite all the work that has gone into making power generation and distribution resilient, the impact of natural hazards cannot be ruled out, and neither can technical failure (Maliszewski and Perrings 2011). Moreover, cyber attack cannot be regarded as a threat that is totally under control, nor as one that will remain so (Stefanini and Masera 2008, Piccinelli et al. 2017). In this sense, the 2015-16 cyber attacks on the Ukrainian power grid have acted as a wake-up call to the electricity industry (Lee et al. 2016). In the last analysis, power supply will never be completely safe against widespread failure.

Because water supply and sanitation, fuel supply, food distribution and other services depend on the availability of electricity, there are grounds for regarding it as the primary form of critical infrastructure (Kröger 2008). It also provides some essential mechanisms through which critical infrastructure failure is linked to cascading disasters. In most places, the degree of dependency of society on electricity has not been tested by a prolonged, widespread outage (although around the world major events of this kind occur with a frequency of about once a year, and less consequential events orders of magnitude more often - Atputharajah and Saha 2009).

At present, we have a poor understanding of the degree to which we depend on electricity. Consider the impact of prolonged loss of power on food conservation and distribution. If motor fuel cannot be pumped, food will rot in warehouses. If refrigeration fails, food will rot in situ. This may lead to a proliferation of gastro-enteric diseases as contaminated food is eaten, for example in the home environment, and it would certainly pose the problem of how to dispose of large quantities and varieties of spoiled food. The consequences would therefore include an extra burden on hospitals and the task of restoring the food supply chain.

From advertising to sales and dispatch, commerce is now heavily, almost universally, dependent on electronic systems. Hence, interruption of electricity supply inevitably means interruption of business: the supplier cannot sell and the customer cannot buy. In such a situation, it will be interesting to see what degree of cushioning there is between interruption of service and bankruptcy. This came close to being tested in both "9-11" and the eruption of Eyjafjallajökull in 2010, each of which imposed a ground stop of about a week on the airlines, leading to massive losses of revenue (Alexander 2013a).

How would a "cashless society" manage in the absence of electronic banking and electrically driven transactions? This problem covers a wide spectrum because it stretches from simple issues about paying for essential goods, such as food, to complex ones about major time-dependent electronic transactions, such as house purchase conveyancing.

One of the most significant and least explored elements of dependency upon electricity is the psychological side. For people who are completely habituated to communicating via social media and telephone, what would it mean to have to do without these channels (Wang et al. 2015)? This brings us to Barton's post-disaster 'therapeutic community' (Barton 1970). It is probable that a prolonged black-out would lead to more cooperation, social identification and self-sacrifice. It would tend to bring outcasts into the social circle rather than reinforce their exclusion. However, the other side of the coin is crime and social deviance. Despite the prevalence of the 'therapeutic community' and its reinforced consensus on what is right and proper, for criminals disaster is an opportunity (Zahran et al. 2009). Looting (Alexander 2013b) is not an inevitable consequence, but where the preconditions exist (for example, deprivation, lawlessness or a lack of social justice), it may be a significant outcome. The connection between electricity supply failure and looting has been well researched in its North American context (Muhlin et al. 1981, Wohlenberg 1982). However, before plans are laid to cope with a massive onset of looting as the lights go out, perhaps attention should be devoted to whether those preconditions are present and what their presence or absence signifies for the propensity to loot.

If we became completely habituated to using digital technology, would we be able to think and act effectively in its absence? Information technology has caused people to retreat from reality, and at the same time it has made itself indispensable. If this seems to be too extreme an interpretation, an alternative view is that information technology has redefined reality. However, technological failure could redefine it again.

The Internet age has given a special sort of prescience to the renowned science fiction story that E.M. Forster wrote in 1909, The Machine Stops (Forster 1928). This is an apocalyptic tale of how universal dependency on technology leads to the breakdown of civilisation and the annihilation of all those who depend on it, except for a small group of people who have managed to break away and revert to a more natural form of living. Forster and his contemporaries faced the incubus of the First World War, in which the machine gun and poison gas did so much to show the prowess of technology on the killing fields. Yet it was not until the beginning of the nuclear age that people and their prophets began to see technology as genuinely capable of putting an end to civilisation. Nonetheless, Forster's magisterial tale at least offers his readers a glimpse of future regeneration. Whether or not Forster was foreseeing something in the future, doomsday scenarios and the means of coping with them remain extremely difficult to think through (Bostrom and Cirkovic 2011, Denkenberger et al. 2017).

Forster's story relies heavily on automation, which is in turn dependent on the algorithms that make it function. The proliferation of algorithms is becoming a major influence upon modern life. All algorithms are models, and all models simplify reality. Good models are elegant simplifications that successfully extract the 'signal' from the 'noise' that surrounds it. However, the simplification process involves making assumptions, which may turn out to be valid or false. By their very nature, as part of the modelling process, assumptions exclude information, observations and elements of reality. When algorithms fail, reality surges back, with all of its awkward complications and chaotic implications.
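A trivial sketch illustrates how an excluded assumption becomes a point of failure. The Python fragment below uses an invented 'persistence' forecast, which assumes that tomorrow will resemble today; it is not drawn from any real forecasting system, and all the numbers are made up. The assumption serves well on ordinary days and fails completely on the day that the excluded possibility, a sudden outage, actually occurs.

```python
def persistence_forecast(history):
    """A one-line 'model': assume the next value equals the last one."""
    return history[-1]

ordinary_days = [100, 102, 101, 103, 102]    # demand on quiet days
print(persistence_forecast(ordinary_days))   # 102 -- a serviceable guess

# The model's simplifying assumption (continuity) excludes the possibility
# of a sudden outage, so on the day one occurs it is wrong by everything:
outage_day_demand = 0
print(persistence_forecast(ordinary_days) - outage_day_demand)  # error: 102
```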

At present, we are not devoting enough attention to the question of how digital development creates vulnerability and dependency. Dependency, in fact, is the motor of vulnerability. All technology is ultimately fallible, but how vulnerable are we to its failure? The present tendency is to counter this fallibility with the application of yet more technology. Nothing could be more conducive to breeding the conditions for cascading failure. Will the failure of artificial intelligence and information technology create tragedy in its own right? If not, will it nevertheless contribute to, exacerbate or multiply tragedies?

Technology has reorientated person-to-person communication. It has opened up new avenues, both good and bad, for leadership and for the management of public opinion. The sudden loss of that technology, for example through any form of prolonged equipment failure, would inevitably lead to resocialisation, but largely through a highly inefficient process of improvisation, of trial and error. Paradoxically, by creating massive redundancy, the very inefficiency of that process may be the source of its richness and success. In the end, if failure occurs on a grand scale (Pescaroli et al. 2018), attitudes to technology will never be the same again.

References

Alexander, D.E. 2013a. Volcanic ash in the atmosphere and risks for civil aviation: a study in European crisis management. International Journal of Disaster Risk Science 4(1): 9-19.

Alexander, D.E. 2013b. Looting. In K.B. Penuel, M. Statler and R. Hagen (eds) Encyclopedia of Crisis Management, Vol. 2. Sage, Thousand Oaks, California: 575-578.

Atputharajah, A. and T.K. Saha 2009. Power system blackouts: literature review. International Conference on Industrial and Information Systems, December 2009, Sri Lanka: 460-465.

Barton, A.H. 1970. Communities in Disaster: A Sociological Analysis of Collective Stress Situations. Doubleday, New York, 368 pp.

Bompard, E., R. Napoli and Fei Xue 2009. Analysis of structural vulnerabilities in power transmission grids. International Journal of Critical Infrastructure Protection 2(1): 5-12.

Bostrom, N. and M.M. Cirkovic (eds) 2011. Global Catastrophic Risks. Oxford University Press, Oxford, 560 pp.

Chang, S.E., T.L. McDaniels, J. Mikawoz and K. Peterson 2007. Infrastructure failure interdependencies in extreme events: power outage consequences in the 1998 ice storm. Natural Hazards 41(2): 337-358.

Chiaradonna, S., F. Di Giandomenico and P. Lollini 2011. Definition, implementation and application of a model-based framework for analyzing interdependencies in electric power systems. International Journal of Critical Infrastructure Protection 4(1): 24-40.

Denkenberger, D.C., D.D. Cole, M. Abdelkhaliq, M. Griswold and J.M. Pearce 2017. Feeding everyone if the sun is obscured and industry is disabled. International Journal of Disaster Risk Reduction 21: 284-290.

Elizondo, D.C., J. de La Ree, A.G. Phadke and S. Horowitz 2002. Hidden failures in protection systems and their impact on wide-area disturbances. IEEE Power Engineering Society Winter 2001 Conference, Proceedings, Columbus, Ohio: 710-714.

Forster, E.M. 1928. The machine stops (1909). In The Eternal Moment and Other Stories. Sidgwick and Jackson, London, 185 pp.

Kröger, W. 2008. Critical infrastructures at risk: a need for a new conceptual approach and extended analytical tools. Reliability Engineering and System Safety 93(12): 1781-1787.

Lee, R.M., M.J. Assante and T. Conway 2016. Analysis of the Cyber Attack on the Ukrainian Power Grid. SANS Industrial Control Systems, Electricity Information Sharing and Analysis Centre, Washington, DC, 23 pp.

Luke, T.W. 2010. Power loss or blackout: the electricity network collapse of August 2003 in North America. In S. Graham (ed.) Disrupted Cities: When Infrastructure Fails. Routledge, New York: 55-68.

Maliszewski, P.J. and C. Perrings 2011. Factors in the resilience of electrical power distribution infrastructures. Applied Geography 32(2): 668-679.

Muhlin, G.L., P. Cohen, E.L. Struening, L.E. Genevie, S.R. Kaplan and H.B. Peck 1981. Behavioral epidemiology and social area analysis: the study of blackout looting. Evaluation and Program Planning 4(1): 35-42.

Nateghi, R., S.D. Guikema, Y. Wu and C.B. Bruss 2016. Critical assessment of the foundations of power transmission and distribution reliability metrics and standards. Risk Analysis 36(1): 4-15.

Observer 2016. Artificial intelligence: ‘We’re like children playing with a bomb’. The Observer, 12 June 2016.
https://www.theguardian.com/technology/2016/jun/12/nick-bostrom-artificial-intelligence-machine

Pescaroli, G., R.T. Wicks, G. Giacomello and D.E. Alexander 2018. Increasing resilience to cascading events: the M.OR.D.OR. scenario. Safety Science, 10 pp.

Piccinelli, R., G. Sansavini, R. Lucchetti and E. Zio 2017. A general framework for the assessment of power system vulnerability to malicious attacks. Risk Analysis 37(11): 2182-2190.

Stefanini, A. and M. Masera 2008. The security of power systems and the role of information and communication technologies: lessons from the recent blackouts. International Journal of Critical Infrastructures 4(1-2): 32-45.

Terzija, V., G. Valverde, D. Cai, P. Regulski, V. Madani, J. Fitch, S. Skok, M.M. Begovic and A. Phadke 2011. Wide-area monitoring, protection, and control of future electric power networks. Proceedings of the IEEE 99(1): 80-93.

Wang, C., M.K.O. Lee and Z. Hua 2015. A theory of social media dependence: evidence from microblog users. Decision Support Systems 69: 40-49.

Wohlenberg, E.H. 1982. The “geography of civility” revisited: New York blackout looting, 1977. Economic Geography 58(1): 29-44.

Zahran, S., T. O’Connor Shelley, L. Peek and S.D. Brody 2009. Natural disasters and social order: modelling crime outcomes in Florida. International Journal of Mass Emergencies and Disasters 27(1): 26-52.