Book Review: Catastrophe and Systemic Change by Gill Kernick

Reading Time: 4 minutes

 

Every now and then a book comes along that is so spot on, you can’t believe it hadn’t already been written.

That’s the case with Gill Kernick’s book Catastrophe and Systemic Change: Learning from the Grenfell Tower Fire and Other Disasters.

Book cover: Catastrophe and Systemic Change by Gill Kernick

Gill lived on the 21st floor of Grenfell Tower between 2011 and 2014. We all have our own recollections of the morning of 14 June 2017. Like many of us, Gill watched the Tower burn. Unlike many of us, her former neighbours were amongst the 72 people who tragically died. Her book is dedicated to them.

In part, the book is an exploration of the systemic issues behind why we don’t learn from disasters. Kernick has worked in high hazard industries and brings examples from there, as well as from other disasters, to show repeated failures to bring about post-disaster change. But her book is also intensely personal, and in parts reads like a diary, as if she is making sense of her own emotions and thoughts, processing all of this during a pandemic when other learning is also falling by the wayside. I didn’t expect the book to make me emotional, but it did.

Before discussing what I found particularly resonant about the book, a little disclaimer. I’ve been involved in the response to Grenfell since the early hours of that night in 2017. I’m still involved now. The Grenfell Tower Public Inquiry is continuing. For those reasons, I won’t be commenting on chapters 1 and 2 which consider the specifics of Grenfell, but will focus on the themes of learning and systemic change.

The book catalogues failed opportunities to learn. There is a whole swathe of documents, reports, investigations, inquiries and research showing that, despite assurances, ‘lessons identified’ rarely translate into ‘lessons learned’ at either the scale or pace required. And this continues: on 24 June 2021 at the Manchester Arena Inquiry, there was considerable discussion about the collective failure to learn lessons from an emergency exercise.

Right there on page 6, my frustration is in black and white: “the system is designed to ensure we do not learn”.

My biggest personal professional frustration is the repeat lessons. To ‘learn’ something, again and again, is to demonstrate that it is not actually being learned or addressed sufficiently. I’ve worked in organisations where the focus is on putting in place systems and processes to ‘help’, but which all too often just result in shuffling bits of paper around.

Kernick draws a distinction between piecemeal change (which involves looking at the system and making changes within it) and systemic change (which considers the conditions and cultures within which the system operates). She asserts that systemic change requires disruption of the status quo, but observes that kindness can be more disruptive than aggression.

We live in a complex, messy, often unpredictable world. I think it gives us a sense of comfort to think that we are ‘in control’ and can forecast what will happen in a given situation, but the reality is that emergent behaviours and complex dynamics between systems mean that we are only just scratching the surface.

I’ve blogged previously about work to understand emergent behaviour (‘sit in the messiness’ and ‘pop up emergency planning’), interdependencies between systems as well as my desire to see more empathetic approaches towards emergency management. It’s heartening to see that somebody outside the emergency management field also sees the same issues. It gives me a new resolve to try to address them.

To operate effectively in an increasingly complex world, Kernick suggests that governments need to change how they approach public engagement. I’d go further: this is not limited to engagement; governments need to embrace flatter, more organic structures for emergency response and move beyond ‘command and control’.

Emergencies and disasters often have high levels of uncertainty. This calls for what Kernick refers to as ‘the democratisation of expertise’. None of us individually have all the answers; we need to work together to make sense of a situation and determine a course of action. It’s an unspoken principle that runs through emergency management. That’s why we have COBR, once described to me as “Whitehall in miniature”, which brings together a bunch of people to find a collective answer. The same applies to Local Resilience Forums and Strategic Coordinating Groups. They are structures that allow knowledge and expertise to be pooled.   

And those structures need to be more representative of the communities they serve. We need people with different lived experiences to shape a response that will be better for everyone.

Kernick then moves on to consider the role of accountability and scrutiny in Government. The conclusion generally seems to be that structures for scrutiny are okay, but the willingness or ability of governments to act on that scrutiny is low. There is no structure that can compel public inquiry recommendations to be addressed. The same is true of Prevention of Future Deaths reports, rail industry incident reports and many others. They all swirl around, unaddressed, in a soup of known issues ready to boil over the next time there is an incident.

So, why don’t we learn? It’s a question I come back to a lot and which this book has helped me to explore. Through the book Kernick goes on a journey about learning: what begins as curiosity becomes increasingly frustrated and ultimately incredulous. I’m not quite at that stage just yet, but I do think there is a requirement to turn the mirror on ourselves and really examine the conditions and beliefs we hold on to, which might be stopping us from making more progress.

And so, we come back to where we started, that systemic change requires “a tribe of disrupters” and I hope this book galvanises emergency managers across the land to be braver and to disrupt with kindness.

Book Review: The Premonition by Michael Lewis

Reading Time: 5 minutes

 

This is the first book review I’ve written since being in secondary school, which…well, was a while ago, so go easy on me. I was inspired by a tweet a few weeks ago…

There has been some chatter both online and offline recently about the ‘visibility of emergency management’. Professor David Alexander’s article last summer asked “where are the emergency planners?“. The Emergency Management Growth Initiative has been seeking to bring greater awareness. And there have been recent challenges to the narrative that ‘plans didn’t exist’ for the UK response to the COVID pandemic. 

Generally, there’s a view from within that emergency management needs to be more mainstream, especially in the minds of political leaders.

Over the last 9 years I’ve also tried to use this blog as a way to bring greater visibility to emergency management issues; most directly in an early post about breaking out of the bunker, which is simultaneously the natural habitat of the Emergency Manager but can also be what holds us back as a profession.

It was with great excitement that I ordered Michael Lewis’ book The Premonition, about a group of like-minded (and like-frustrated!) individuals who know that something serious needs to be done about pandemic planning. The book tells how a small group initiated and then performed repeated course corrections to US pandemic planning in the face of indifferent, layered and fragmented bureaucracies. Speaking about the Swine Flu pandemic of 2009, one of the cast notes “there was no one driving the bus” and that despite pockets of good work across the country, the formal bodies people looked to for leadership (the Centers for Disease Control gets an especially scathing review) were deeply dysfunctional.

The book repeatedly asks the question “What happens when the people in charge of managing the risks have no interest in them?”. Pretty much every time it circles back to passionate people fighting to be heard and finally breaking through (often to be un- or under-appreciated).

Like Love Actually, there are several intertwined stories at play. Initially, each of the main characters (they’re actually real people) is doing their own wonderful thing in splendid isolation, solving local problems using local means. But the characters are brought together through chance meetings, introductions or happenstance, and realise their collective power.

One observation is that for a Public Health Officer in the States, there is no defined career path. I’ve heard similar representations about Emergency Management. This is thought to be a problem because it produces such a diversity of approaches and backgrounds, and therefore a lack of a common approach. However, I would argue that this allows multiple perspectives to be incorporated more readily and more organically, though I agree that some standardisation could be beneficial.

Like in an emergency, rapid response is vital to control and reduce the impact of disease outbreaks. The response to outbreaks and emergencies often needs to be instinctive, Kahneman’s ‘System 1’ rather than the more considered ‘System 2’. As one of the protagonists remarks about a Hepatitis C outbreak “if we had waited for enough evidence to be published in journals then we would have already lost,” and similarly, later in the book talking about wildfire response, someone remarks “you cannot wait for the smoke to clear – once you can see things clearly it’s already too late.”

Active vs passive choice seems to be another recurring theme throughout The Premonition, reminiscent of the Trolley Problem.

In particular, there is a chapter that considers the response to potential health issues following a Californian mudslide, where one of the stars of the book is described like this: “She processes information quickly and spits out a decision fast, that makes people nervous. You don’t find people like that in government.”

Considering the profession, or at least the decision-makers’ backgrounds, there is an observation that the Homeland Security Council was “staffed by military types who spent their days considering attacks from hostile foreigners, not the flu” and that this had the effect of cognitive narrowing, choosing not to see the things which were unfamiliar.

One of the characters talks about how they wanted to try to get the President, then George W Bush, to pay attention to the widespread impact that a serious pandemic could have across all society, not just healthcare. I was particularly amused that rather than formal submissions and briefings, actually what got the President interested was providing him with an annotated history book.

An intensivist doctor talking about tough clinical decisions remarks that “I felt like my best when shit hits the fan. I focus like a laser when everything is going to shit” and someone else mentions “You are going to make mistakes. The sin is making the same mistake twice and best is to learn from other people’s mistakes.”

The Premonition isn’t a popular science review of pandemic interventions and response strategies. Although, if there is a Hollywood adaptation (like those of Lewis’ Moneyball and The Big Short), there would be a part for Selena Gomez to reprise her Big Short cameo, this time explaining dense public health theories and concepts. There’s an extended section which compares 1918 influenza pandemic interventions in Philadelphia and St Louis, with supporting evidence indicating that “cities that intervened immediately experienced less disease and death” and further that cities which “caved to pressure from businesses to relax social distancing then experienced a more severe second wave.”

Lewis also presents research that concludes that you “couldn’t design a better system for transmitting disease than the school system,” which got me thinking about perceptions, and why there is a persistent view that closing schools is a bad idea. Surely it’s only a bad idea if it is done badly?

The book notes how we are notoriously bad at understanding statistics and complex dynamics. Exponential growth is hard for us to visualise beyond the first few steps. Lewis gives the example of a piece of paper which, folded 50 times, would be thick enough to reach a distance of 70 million miles. It just doesn’t seem right.
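The folding figure checks out with a little arithmetic: the thickness doubles with each fold, so after 50 folds a sheet is 2^50 times its original thickness. A quick sketch, assuming a sheet roughly 0.1 mm thick:

```python
# Thickness doubles with each fold: t_n = t_0 * 2**n
sheet_mm = 0.1                             # assumed thickness of one sheet, in mm
folds = 50

thickness_mm = sheet_mm * 2 ** folds       # ~1.13e14 mm after 50 folds
thickness_miles = thickness_mm / 1.609e6   # 1 mile is about 1.609e6 mm

print(f"{thickness_miles:,.0f} miles")     # roughly 70 million miles
```

The first ten folds get you to about a centimetre, which is exactly why the end result feels so wrong.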

What comes through most clearly is that more often than not this doesn’t come down to expertise or evidence. Success often is the result of people who work around the system. Individuals with passion projects that compensate for the failings and deficiencies of their organisations.

My own passion project has been to try and better surface and understand interdependencies between different systems. It’s easy to become a specialist in your own field, but to see how that connects and relates to other areas is less common. My Anytown project started off as a way to try and convey the ‘whole society’ impact of various scenarios. The Premonition covers some of this in a short section that identifies the pressures on the production of nasal swabs which are only manufactured in three locations worldwide and are in extreme demand during a pandemic.

However, Lewis also makes the observation that decisions can no longer be made purely on the basis of technical evidence and draws the book to a conclusion noting that “greater attention needs to be paid to how decisions might appear to a cynical public.”

There are some wild claims throughout, such as “The US invented pandemic planning in 2005”, which I’m not sure would stand up to much scrutiny. And I’m sure that trying to tell a history of COVID whilst we are all still living through COVID means there is more to be uncovered. But overall, The Premonition is an easy to read yet insightful book which casts light on, more often than not, the failings of government-level risk management and the commitment and passion of public health and emergency management professionals, noting that some are “so committed it’s more of a mission than a job.” 

 

Next on my reading list: Catastrophe and Systemic Change by Gill Kernick

Pandemic Anytown

Reading Time: 3 minutes

Remember that song you learnt at school… “the knee bone’s connected to the leg bone“? Well, that song tells us to think of the body as a system of interconnected and interdependent components which work together to form a whole. Make a change somewhere and the repercussions of that will be felt elsewhere.

Other metaphors are available: The butterfly effect. The domino effect.

For a whole host of reasons, though, we often focus on components over systems; and it’s important that we do.

It’s important that when we plug something into an electrical socket or turn on a tap, what we are expecting comes out.

But we should ask ourselves why that is important. It’s important because, owing to our highly connected modern society, when a component fails the cascading impacts of that can be felt far and wide. It’s not just inconvenient, it can sometimes have direct safety implications.

When an earthquake struck northern Italy in 2012, the NHS in the UK had a supply issue with dialysis tubing.

We’re seeing similar right now with the COVID pandemic. It’s not just the impact of people who contract the disease, but the far-ranging impacts and knock-on effects of social distancing and isolation, reduced international travel and changing perceptions of risk.

I started the ANYTOWN project in 2013 as an attempt to better understand and describe how a partial or total failure of a network can impact on other connected networks. In some circumstances, this can lead to a much larger range of impacts than just the initiating incident.

Previous blog posts about ANYTOWN cover a bit more of the background of the project. But I’ve been attempting to apply the same model to describe what we are seeing (and may see in the coming months) with COVID.

There is very little ‘real science’ to this. Previous Anytown work was informed by extensive focus group research; however, as this is a highly dynamic situation, what follows is primarily my own musings. I shared it on LinkedIn over the last week and I’m indebted to those who have made suggestions and offered feedback.

This is a work in progress. It is biased towards my own experiences as a middle-class white man in his thirties in London. I appreciate that other people’s experience of COVID will be different. I want to reflect that in future versions, but at the moment it is a limitation that I have noted.

Here’s version 1.2 for you to explore…

Starting in the centre is the initiating incident, in this case, the pandemic virus. Although there may be some specifics to COVID I suspect many of the cascading consequences would be relatively similar across different global pandemic threats.

The next ring out from the centre describes the ‘first-order’ impacts that are/have been observed across a range of different sectors. So some of the first impacts that would be anticipated (and have played out with COVID) are the introduction of social distancing measures, reduced public transport use and increased handwashing.

Second and third-order impacts for each sector are then captured as you move further from the centre. The diagram deliberately doesn’t indicate timescales; I intended this to help understand sequence, not timing.
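One way to make the ring idea concrete is to treat the diagram as a graph of consequences, where each impact’s ‘order’ is simply its distance from the initiating incident, found by a breadth-first walk. This is a minimal sketch; the impacts and links below are illustrative examples, not taken from the actual Anytown diagram:

```python
from collections import deque

# Illustrative consequence graph: each impact -> impacts it can trigger
cascades = {
    "pandemic virus": ["social distancing", "increased handwashing"],
    "social distancing": ["reduced public transport use", "remote working"],
    "reduced public transport use": ["lower fare revenue"],
    "increased handwashing": ["soap demand spike"],
    "remote working": [],
    "lower fare revenue": [],
    "soap demand spike": [],
}

def impact_orders(graph, start):
    """Return each impact's 'ring': its distance from the initiating incident."""
    order = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in order:          # keep the first (shortest) order found
                order[nxt] = order[node] + 1
                queue.append(nxt)
    return order

rings = impact_orders(cascades, "pandemic virus")
print(rings["lower fare revenue"])  # 3: a third-order impact
```

As in the diagram, the ring number says nothing about timing, only about the sequence of cause and effect.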

This is a bit of a thought experiment to see if the model would work having previously been geared towards ‘hard infrastructure’ systems failure. I think it does, but it needs some development. I’m incredibly grateful to those who have made suggestions (I haven’t checked that it’s ok to specifically credit them so acknowledgements to feature in a future version!) or have commented that they have found it useful.

It’s not the answer to the problem. Not by any means.

But hopefully it’s a useful tool to help us all think about our increasing interconnectedness. Normally this is super helpful, but it can sometimes work against us. At a time when there’s lots of uncertainty about lots of things, perhaps this offers a bit of a glimpse into the future to help us be prepared.

What Jurassic Park taught us about cyber risk

Reading Time: 3 minutes

The tl;dr version of this post: don’t forget about the insider threat!

This week I attended the first in a series of three events by the Institution of Civil Engineers entitled Preparing London. This particular event was designed to consider the human threats to infrastructure.

During a talk from Nathan Jones (see this blog on his talk) my mind wandered and wondered…Did Jurassic Park teach me everything I know about cyber risk?

God damn it! I hate this hacker crap!

Ok, so maybe not everything worth knowing about cyber risk is summarised in Jurassic Park, but it’s a useful introduction to what happens when the tables are turned and the technology which usually helps keep us safe becomes the risk.

Everything in Jurassic Park is connected. The electric fences, the lighting in the visitor centre, the locks on the doors. When it’s working as planned, this connectivity helps the park’s management maintain an efficient operation and a positive guest experience.

However, such a complex system requires some centralised control.  Looking at this through a business continuity lens, this is a clear single point of failure. An inherent risk.

This has clear parallels with our modern society and the interdependencies between systems that I’ve talked about previously.

Dennis Nedry exploits his colleagues’ limited understanding to enact his attack. He uses his tech-savvy advantage to provide cover for his theft of intellectual property, whilst putting lots of people in danger. The ultimate lesson here is that the real monsters aren’t the dinosaurs.

Objects in mirror are closer than they appear.

As well as providing a light-hearted moment during the dinosaur chase sequence, I think Spielberg also snuck this in as a metaphor for risks manifesting in ways which had not been considered.

Were the Jurassic Park team aware of cyber risk? Yes, there is literally a scene about passwords. I expect a lot of people assume that a good password is all they need for their IT security.

It’s clear they had also considered other risks, and had taken proactive action to control them. Electric fences, professional hunters, CCTV and motion sensors, and the attempt at all-female genetic engineering are just some of the risk controls in the film.

But had the team considered the possibility that an employee would want to hold the park to ransom for personal gain? Could they have identified the vulnerability of the computerised control? Could they have done more in advance to protect the systems from malicious attack?

Dennis, our lives are in your hands.

Early in the film there are hints at Nedry’s personal financial difficulties. Later he mumbles to himself about test runs of his embryo heist.

John Hammond, the park owner, recognises the power that Nedry has.

There were clearly signals which the team missed, and knowledge which, if combined, could have allowed an intervention before he got the opportunity to shut down the park.

Clever girl / I know this.

Just as the team hadn’t anticipated an insider threat, Nedry wasn’t expecting a tech-savvy teenager to thwart his plan.

Just when it looks like the raptors will get into the control room, Lex (the park owner’s granddaughter) recognises the Unix system and takes matters into her own hands.

The actual interface may be debatable (in researching (yes, research!) this post I’ve found that it was technically available, but I’m doubtful that a school student would have been aware of it), but it comes as no surprise that kids have a natural affinity with the technology that adults have to think about.

Side note: Provided the right precautions are in place to prevent unauthorised use, user friendly systems aren’t just a productivity win; they help prevent people finding work-arounds or backdoors.

Life finds a way.

With ever-increasing access to, and the pervasiveness of, the Internet and smart devices, Jurassic Park remains relevant today.

I’d argue that we’ve already reached a point where complete understanding of system interdependencies is impossible. Our societies and the technologies used are just too complex. However, we can continue to challenge our assumptions, keep our risk assessments grounded in reality and take action in advance to mitigate that risk.

It’s also a reminder that physical and IT security are just parts of the puzzle when it comes to risk management. Solutions are also required, sadly, to protect against malicious attack by either insiders or outsiders.

It’s also just a really great film!

Have I Got News For You

Reading Time: < 1 minute

Regular readers will be aware that I’ve been working on a project called Anytown over recent months. It’s a project looking at complexities and interdependencies between systems and how that can impact on resilience. Previous blog posts about it are here, here and here.

Rather excitingly, you can now find out about Anytown in Resilience, the magazine of the Emergency Planning Society. Yes, there is now a remote possibility that I will be quoted on BBC prime-time programming, having been published in a niche trade magazine! (I’m hoping Charlotte Church might make a reprise as host for ‘my’ episode….)

They saved the best until last, so head to pages 38 and 39 (but then take a look at the rest of the mag). Even if you don’t learn more about interdependencies, you’ll have an edge in the missing words round, and everyone likes winning.

As always, comments appreciated either in the box below or direct to my inbox over here.

With thanks to colleagues at the EPS for publication

Anytown – latest visualisation

Reading Time: < 1 minute

Displaying complex or detailed information in a digestible way is always an interesting challenge. It’s certainly one of the challenges that I’ve had with Anytown, my project to better understand interdependencies and complexities within and between systems.

Here’s my latest attempt at showing some of this information, which I developed for an NHS England briefing today, using information from the Hurricane Sandy report “A Stronger, More Resilient New York”.

Linkage

Nodes of different ‘city networks’ are shown in thematic colours (Gas, Electricity, Fuel, Water, Telecommunications and Wastewater). Please note that this is an illustration, not a schematic of each network. The connections within networks are shown with black lines, and where there is an interdependency with another network it is shown in red.
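In data terms, a picture like this can be captured by recording which network each node belongs to and then classifying every connection: edges whose endpoints share a network are the black lines, and edges that cross between networks are the red interdependency lines. The nodes and links below are hypothetical examples, not the actual diagram:

```python
# Each node belongs to exactly one 'city network'
networks = {
    "substation": "Electricity",
    "pumping station": "Water",
    "treatment works": "Water",
    "mobile mast": "Telecommunications",
}

# Connections between nodes, regardless of network
edges = [
    ("pumping station", "treatment works"),  # within Water: a black line
    ("substation", "pumping station"),       # Electricity -> Water: a red line
    ("substation", "mobile mast"),           # Electricity -> Telecoms: a red line
]

def interdependencies(networks, edges):
    """Edges whose endpoints sit in different networks (the 'red lines')."""
    return [(a, b) for a, b in edges if networks[a] != networks[b]]

print(len(interdependencies(networks, edges)))  # 2 cross-network links
```

Separating the two edge types like this makes it easy to ask questions such as “which networks does Electricity prop up?” without redrawing the diagram.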

I’m now working on a way of using this alongside the previously developed ripple diagrams to better articulate interdependencies, ideally in an interactive way. If you have any thoughts on how this could best be achieved, drop a comment in the box below, or get in touch directly via Contact Us.

Anytown Unleashed

Reading Time: < 1 minute

For the last 4 months I’ve been spearheading a project known as Anytown. The project aims to help develop better understanding and awareness of how different ‘city systems’ all interlink. Today I unleashed my baby into the world at Defra’s Community Resilience & Climate Change Workshop. Read more on the project below.

When you throw a stone in a pond, ripples propagate from the centre. Similarly in emergencies and disasters, impacts of an initiating event can propagate and cause a cascade of consequences. There are many examples of this both in the UK and overseas, yet there has been little formal consideration of it to date.

The intention of Anytown is to simplify reality and model the interconnections and interdependencies between systems in order to provide a greater level of awareness of these potential impacts.

During my studies we had an assignment involving ‘Complex Cascading Disasters’ and I remember that, at the time, there was little readily available research in this area. That situation hasn’t changed significantly, so in February I coordinated a number of workshops bringing together over 100 representatives from 52 organisations to discuss and harvest their knowledge and experience.

Looking back to my earlier ripple analogy, from the workshop data I created ‘ripple diagrams’ which demonstrate how consequences cascade from an incident through various sectors.

Anytown is now out in the world. This is exciting, as one of the key things I realised during development is that a model is only as good as the information that feeds it – so now many more people have the opportunity to contribute. I’ll bring occasional updates on the progress of Anytown as I move (hopefully) from model development towards visualisation and simulation.

The ‘work’ version of this post is over here

Complexity & Interdependency

Reading Time: 2 minutes

Image: interdependency diagram

I’m currently working on a project investigating Infrastructure Ecology, although that’s not how I describe it at work for fear of alienating the audience! It’s a fascinating area of enquiry, which the diagram above only partially articulates and I’d need more than one blog post to do it justice. So I thought I’d start with why I think it’s fascinating.

When we flick on a light switch, twist a tap or pick up our phone we expect those services to work. We’ve come to rely on them, and largely that doesn’t cause us any issues – the lights come on, water comes out of the tap and we hear a dial tone. However, incidents (Gloucestershire flooding 2007 and Hurricane Sandy 2012 to name just two examples) and exercises that I have either participated in or facilitated consistently reveal that these systems are far from 100% reliable.

Too often we treat things in silos, but increasingly we need to consider how the different systems that we have developed and have evolved alongside over many years interact and depend on each other. In a previous role, I facilitated a business continuity exercise for a large teaching hospital. The scenario was pretty basic, but it revealed that all but 4 of the wards in the hospital had planned to use the same fallback space – in the worst case this meant cramming over 200 patients into a 30 bed ward. We find it difficult to think outside of our sphere; I’m not sure of the reasons why, but we need to recognise that it happens and develop a methodology which forces us to think more holistically.

Interdisciplinary approaches are the way forward. Involving a wider range of people and organisations is risky – and makes camels a more likely outcome – but it’s the only solution to get us out of our silos.

Previous attempts to ‘educate’ professionals about these business continuity challenges concentrated on presentations, and as the same lessons are still coming up, I think we can be confident that levels of awareness have remained largely static. My approach has been to redefine the problem (that non-experts don’t understand the interdependencies and complexities of systems) and to look for solutions from other fields (which is where the ‘ecology’ in Infrastructure Ecology comes in).

Experts in biodiversity have long known that the key to successful interventions is understanding the underpinning relationships between predator, prey and environment. It’s something I vaguely remembered learning at school, and without much thought it was clear that this was a model with applications in helping to understand the infrastructure problems encountered.

Last week I ran two workshops at City Hall, with representation from a wide variety of sectors, organisations and interests, to harvest their experience and knowledge. This will be synthesised to produce a model of an urban area which ‘understands’ how the different systems are related, and therefore what the consequences of an interruption to one will be on the others.

I’m now in the process of translating the data we collected into something meaningful. I have some grand aspirations for the project, and alternate between getting carried away and reining myself in to concentrate on the practical! I’ll keep you posted!

Image Source: NARUC