What can we learn from August 9th?
The summer of 2019 saw the disconnection of more than a million electricity customers in Britain's largest single electricity system disturbance in over ten years. Here, UKERC co-director, Professor Keith Bell, summarises the findings of the government and regulator investigations into the incident and unpacks potential lessons for the electricity sector and its readiness for further decarbonisation without compromising security of supply.
7th February 2020 by Networks

Loss of power
On August 9th last year, the power sector hit the headlines in a most unwanted way: shortly before 5pm that Friday, over a million electricity customers were suddenly disconnected. They were reconnected again within 40 minutes but not without some important services being affected.
These included trains in the southeast, particularly those that, as an investigation by the Office of Rail and Road (ORR) found, had disconnected themselves during the disturbance. Most of these could not then be restarted without a visit from a technician with a laptop.
As a result, the lines were blocked for many hours and rail services were disrupted into the next day. Not good at any time and certainly not on a Friday evening. The ORR investigation was one of three that were started in the aftermath of the incident. The others were by the Energy Emergencies Executive Committee (E3C), a body convened by the UK Government, and the electricity regulator, Ofgem. All three published their reports last month.
But what did they find and what else can we learn from what happened?
So, what happened?
The initiating event was a short circuit on a 400kV overhead line caused by a lightning strike. Such faults typically happen tens of times each year on the transmission network and do not cause losses of supply. On this occasion, however, the voltage depression caused by the short circuit fault provoked an incorrect response from the control systems of the wind turbines at Hornsea offshore wind farm, leading to large oscillations of reactive power and the triggering of the turbines’ own protection – causing almost all of the power being produced at Hornsea to be lost. This should not normally happen.
The voltage depression apparently also caused a steam turbine at Little Barford combined-cycle gas turbine power station, owned by RWE, to trip. It is still not completely clear why this happened or why, to compound the problem, a gas turbine there also tripped a minute later.
A further, known issue made the event worse.
This concerned small scale generation – ‘distributed generation’ (DG) – connected within the distribution networks. Two types of ‘loss of mains’ protection, intended to safely shut down a portion of the distribution network when it becomes isolated from the rest, are known to be triggered by certain disturbances on the transmission system.
‘Vector shift’ protection is sensitive to short circuit faults and ‘rate of change of frequency’ (ROCOF) protection is activated by a large enough instantaneous loss of generation. Although the phenomena are known, what the ESO and the Distribution Network Operators (DNOs) have told Ofgem about how much DG was lost due to each of them on August 9th is based only on estimates. This is largely because, many years after significant volumes of DG started to be connected, the DNOs still lack detailed monitoring of it.
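To illustrate the principle, a ROCOF relay can be thought of as tripping its generator whenever the measured rate of change of frequency stays beyond a set threshold for long enough. The sketch below is purely illustrative: the threshold and hold time are example values, not the settings actually applied on GB distribution networks.

```python
# Illustrative sketch of the decision a 'rate of change of frequency' (ROCOF)
# loss-of-mains relay makes; not any manufacturer's implementation.
# Threshold and hold time are example values only.

def rocof_trip(freq_samples_hz, sample_interval_s,
               threshold_hz_per_s=0.125, hold_time_s=0.5):
    """Return True if |df/dt| stays above the threshold for hold_time_s."""
    violated_for = 0.0
    for f_prev, f_next in zip(freq_samples_hz, freq_samples_hz[1:]):
        rocof = abs(f_next - f_prev) / sample_interval_s
        if rocof > threshold_hz_per_s:
            violated_for += sample_interval_s
            if violated_for >= hold_time_s:
                return True   # relay operates and the DG disconnects
        else:
            violated_for = 0.0
    return False

# A steady fall of 0.2 Hz/s, sampled every 0.1 s, would trip this relay:
print(rocof_trip([50.0 - 0.02 * i for i in range(30)], 0.1))
```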
What were the consequences?
The loss of DG due to ‘vector shift’, the drop in power at Hornsea and the disconnection of the steam unit at Little Barford meant there was not enough generation to meet demand and the system’s frequency fell so fast it caused further DG to trip due to ROCOF.
‘Frequency containment reserve’ is scheduled by the ESO, mostly from generators but also from batteries, to respond automatically to an imbalance between generation and demand. However, there was not enough response to prevent the system’s frequency from falling so far below statutory limits that, within 80 seconds of the lightning strike, it triggered the disconnection of 1.1 million customers by automatic ‘low frequency demand disconnection’ (LFDD) equipment.
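Why the frequency falls so quickly can be sketched with the simple ‘swing equation’ approximation: any unmatched loss of generation is drawn from the kinetic energy stored in the system’s rotating plant, and the frequency falls until enough response has been delivered to close the gap. The figures for stored energy and response below are assumptions chosen only to show the mechanism; this is not a reconstruction of the actual event.

```python
# Minimal sketch of post-disturbance frequency fall using
# df/dt = -imbalance * f0 / (2 * stored kinetic energy).
# Load damping, voltage effects and the real response profile are ignored.

F0 = 50.0                    # nominal frequency, Hz
E_KINETIC_MWS = 200_000.0    # assumed stored kinetic energy of rotating plant, MW*s
LOSS_MW = 1_561.0            # generation lost (figure reported by Ofgem)
RESPONSE_MW = 1_000.0        # assumed frequency response, fully delivered after 10 s
LFDD_THRESHOLD_HZ = 48.8     # first stage of low frequency demand disconnection

f, dt = F0, 0.1
for step in range(int(80 / dt)):                   # simulate up to 80 seconds
    t = step * dt
    response = RESPONSE_MW * min(t / 10.0, 1.0)    # response assumed to ramp in linearly
    imbalance = LOSS_MW - response                 # MW still unmatched
    f += dt * (-imbalance * F0 / (2.0 * E_KINETIC_MWS))
    if f <= LFDD_THRESHOLD_HZ:
        print(f"LFDD first-stage threshold reached after {t:.1f} s")
        break
else:
    print(f"Frequency after 80 s: {f:.2f} Hz; LFDD not triggered")
```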
The E3C report notes that some ‘essential services’ were affected. Those disconnected by the action of LFDD included supplies to Newcastle airport, two hospitals, railway signalling at two sites, railway traction supplies at one site and a water treatment works.
The E3C report also notes further losses during the incident: traction supplies at two locations, railway signalling supplies at six sites, and supplies at two hospitals, one water treatment works and an airport. These were all for as-yet unexplained reasons although, for the most part, back-ups (such as standby generation) operated successfully. However, supplies to an oil refinery and a chemicals manufacturing plant were also reported by E3C to have been disconnected, with full operations taking several weeks to be restored.
Once the incident had started, was the impact inevitable?
The Grid Code, with which all parties connected to the electricity transmission system are obliged to comply, requires that generators should ‘ride through’ temporary faults on the network, such as that which initiated the incident on August 9th. Ørsted’s plant at Hornsea and RWE’s at Little Barford failed to do so.
Although individual infeeds can and do trip, with at least one going pretty much every day, the Grid Code’s ‘low voltage ride through’ requirements are essential for preventing a simultaneous ‘common mode’ loss of large amounts of generation.
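In essence, ‘low voltage ride through’ requirements define a voltage-against-time envelope: provided the voltage at a generator’s terminals stays on or above the envelope during and after a fault, the generator must remain connected. The sketch below shows the idea only; the envelope points are assumptions for illustration, not the actual Grid Code profile.

```python
# Hedged sketch of a 'low voltage ride through' (LVRT) check.
# Envelope points (time after fault in s, minimum voltage in per unit)
# are illustrative assumptions, not the Grid Code values.

ENVELOPE = [(0.0, 0.0), (0.14, 0.0), (0.5, 0.8), (3.0, 0.85)]

def must_stay_connected(t_after_fault_s, voltage_pu):
    """True if the measured voltage is above the envelope at this time,
    i.e. disconnecting here would breach the ride-through requirement."""
    for (t0, v0), (t1, v1) in zip(ENVELOPE, ENVELOPE[1:]):
        if t0 <= t_after_fault_s <= t1:
            v_env = v0 + (v1 - v0) * (t_after_fault_s - t0) / (t1 - t0)
            return voltage_pu >= v_env
    return voltage_pu >= ENVELOPE[-1][1]   # beyond the last point, hold the final level

print(must_stay_connected(0.1, 0.2))   # deep but brief dip: must ride through -> True
print(must_stay_connected(1.0, 0.5))   # prolonged depression: may disconnect -> False
```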
According to Ørsted, the software on the wind turbines at Hornsea has since been updated so at least that particular control system failure should not happen again.
The behaviour of distribution-connected generation, i.e. DG, is generally less well understood – and managed – than that of transmission-connected plant. At least some of it is known to trip as a result of disturbances on the transmission network, though exactly how much depends on the nature of the disturbance. This compounded the problem on August 9th and was exacerbated by further trips that no-one seems to have been expecting.
Normally, the ESO schedules enough frequency containment reserve to cover for the loss of whatever the ‘single largest infeed’ happens to be at the time – reported by Ofgem to have been 969MW on August 9th. However, 1561MW was lost within less than half a second.
Nonetheless, it’s possible that the system frequency might not have dropped as low as the 48.8Hz threshold for the first stage of LFDD if all the scheduled frequency containment reserve had delivered. Unfortunately, it did not all deliver: in certain categories, perhaps as much as 17% of the scheduled response failed to materialise.
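Put as back-of-envelope arithmetic, with the scheduled response volume and the under-delivery treated purely as assumptions for illustration, the gap looks something like this:

```python
# Illustrative arithmetic only. The 969 MW and 1,561 MW figures are from
# Ofgem's report; the scheduled response and under-delivery fraction are
# assumptions made to show the shape of the shortfall.

largest_infeed_mw = 969          # loss the reserve was sized to cover
actual_loss_mw = 1_561           # lost within half a second on August 9th

scheduled_response_mw = 1_000    # assumed frequency containment reserve held
underdelivery_fraction = 0.17    # "perhaps as much as 17%" in certain categories

delivered_mw = scheduled_response_mw * (1 - underdelivery_fraction)
print(f"Loss exceeded the secured infeed by {actual_loss_mw - largest_infeed_mw} MW")
print(f"Response delivered: {delivered_mw:.0f} MW of {scheduled_response_mw} MW scheduled")
```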
The disconnection of ‘essential services’ by the action of LFDD seems less than ideal: could at least the first tranche of LFDD (there are nine of them in total) have been set up on sections of the distribution network that fed no essential services?
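For readers unfamiliar with how LFDD is put together, the scheme is staged: each successively lower frequency threshold disconnects a further block of demand. The sketch below captures that structure only; the nine stages and the 48.8Hz first threshold are as described in the reports, but the remaining thresholds and demand shares are assumptions for illustration rather than the actual GB settings.

```python
# Hedged sketch of a staged low frequency demand disconnection (LFDD) scheme.
# Each tuple is (frequency threshold in Hz, % of demand shed at that stage);
# values other than the 48.8 Hz first stage are illustrative assumptions.

LFDD_STAGES = [
    (48.8, 5), (48.75, 7), (48.7, 7), (48.6, 7), (48.5, 7),
    (48.4, 7), (48.2, 7), (48.0, 7), (47.8, 6),
]

def demand_shed_percent(frequency_hz):
    """Total % of demand disconnected once frequency has fallen to frequency_hz."""
    return sum(share for threshold, share in LFDD_STAGES if frequency_hz <= threshold)

print(demand_shed_percent(48.79))   # only the first tranche has operated -> 5
print(demand_shed_percent(48.45))   # several tranches have operated -> 33
```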
Some essential services, such as trains, also have their own protection that appears to have responded to the drop in system frequency and disconnected them.
What have the investigations concluded?
Ofgem reported that Ørsted and RWE have made payments of £4.5m each to Ofgem’s “voluntary redress fund” in recognition of the impact that failure of their plant had on consumers. One of the DNOs, UKPN, also made a payment, of £1.5m, in recognition of the adverse impact that its actions might have had when restoring disconnected demand before being authorised to do so by the ESO.
There is a programme of work within the industry to replace ‘loss of mains’ DG protection with settings that are less sensitive to transmission system disturbances. However, this programme is scheduled to be completed only in 2022. Ofgem has recommended a review of this timetable. It also recommended a review of various codes and standards including the Security and Quality of Supply Standard (SQSS) that sets out minimum requirements for how the ESO should manage the risk associated with disturbances to the system.
E3C made a number of the same recommendations as Ofgem. In addition, it promised to clearly define what ‘essential services’ are and provide guidance to those services, and to develop a new incident response communications strategy. It also asked the DNOs and the Energy Networks Association to undertake a “fundamental review” of the LFDD scheme.
The ORR recommended that train operating companies should check the settings of train protection systems and that Network Rail should check the nature of their connections to DNOs’ networks.
Of particular note, I think, is that Ofgem has identified a number of issues with the ESO’s existing processes and procedures. These include how the need for frequency management services is identified and the way they are procured. The responses of generation plant to what is a not uncommon disturbance on the transmission network were fundamental to what happened on August 9th.
Ever since liberalisation of the power sector, the ESO and its predecessor have had a role in policing the performance of large generators connected to the system although, according to Ofgem, it was relying on Ørsted to self-certify its plant at Hornsea.
Moreover, as was seen on August 9th, the behaviour of DG also has an impact on the system. An important recommendation by Ofgem is that arrangements for ensuring compliance of generation with relevant codes should now be reviewed.
What are the key learning points?
The August 9th incident was not caused by lack of power generation resources or the variability of wind power. Rather, it concerned the ways in which resources are controlled and the system is operated.
By international standards, electricity supply in Britain is very reliable and the event on August 9th was small, largely because LFDD succeeded in preventing the situation from getting a lot worse. However, the impact on rail users in the south-east was significant.
“One event cannot be taken as a sign of deteriorating system stability or of the complete inadequacy of procedures and conventions that have served us well for many years. However, there is no cause for complacency”
Britain’s supplies of energy need to be progressively decarbonised, the technical characteristics of the electricity system continue to change and the engineering needs to be got right. The costs of the transition need to be kept to a minimum but future electricity users will no doubt expect their supply to be, on average, as reliable as it has been up to now.
The power system has already been experiencing a growth in DG from around 7GW in 2009 to more than 37GW today but little of this is monitored and controlled. Failure to adequately manage the operation of DG represents both a threat to the system and the missing of an opportunity to use the services that DG might provide.
The DNOs’ readiness to become ‘Distribution System Operators’ (DSOs) was not shown in a good light by what happened on August 9th and, in respect of monitoring of DG, compares poorly to arrangements that are already in place in other countries. As Ofgem said, “substantial improvements are required in DNOs’ capabilities if they are to transition towards playing a more active network management role as DSOs”.
A major challenge, illustrated by the behaviour of the wind turbines at Hornsea on August 9th, is represented by the growth in use of power electronics on the system. The overall effect of this is currently not well understood and represents both an opportunity and a threat. The situation is not helped by the confidentiality that sits around the details of how power electronic converters are controlled and there is a need for research and well-qualified people to understand the interactions between different converters on the system.
This comes at a time when the UK Government’s main research funding agency has decided to cut funding to PhD students working on electricity system issues from its core Centres for Doctoral Training programme. Many of these students would have been expected to join the industry. Delivering a resilient system cost-effectively requires the right mix of operational decisions, control facilities, logistics and assets with the right specifications.
Engineering standards, clearly defined roles for the sector’s various licence holders and codes for governing the relationships between them are critical to getting both the engineering and the commercial relationships right among so many different actors.
“Responsibilities for ensuring electricity system resilience – preventing, containing and recovering from interruptions to supply arising from disturbances – need to be clarified and applied in a more rigorous way”
One particular suggestion by Ofgem is that “it may be necessary to consider standards for assessing explicitly the risk-weighted costs and benefits of securing the system for certain events”. While not a new idea, this promises to allow actions by the ESO (and DNOs) to be better targeted but depends on the collection of good data on the performance of all parts of the system, something that is not done to anything like the required level today.
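The shape of such a risk-weighted assessment is simple to sketch: compare the cost of securing the system against an event with the probability-weighted cost of the consequences of leaving it unsecured. Every number below is an assumption chosen only to show the calculation, not a proposal for the values a standard would actually use.

```python
# Illustrative risk-weighted cost-benefit comparison; all inputs are assumptions.

event_probability_per_year = 0.2        # assumed chance per year of the event if unsecured
energy_unserved_mwh = 400               # assumed demand disconnected if it occurs
value_of_lost_load_per_mwh = 17_000     # £/MWh, assumed valuation of unserved energy
annual_cost_of_securing = 2_000_000     # £/year, assumed cost of extra reserve/constraints

risk_weighted_benefit = (event_probability_per_year
                         * energy_unserved_mwh
                         * value_of_lost_load_per_mwh)
print(f"Risk-weighted benefit of securing: £{risk_weighted_benefit:,.0f}/year")
print("Worth securing" if risk_weighted_benefit > annual_cost_of_securing
      else "Cheaper to accept the risk")
```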
Ofgem said that various issues uncovered by their investigation into August 9th have given them cause for concern about the way the system is operated. Unlike the investigations of Ørsted and RWE, their investigation of the ESO is continuing and will feed into the review by the Department for Business, Energy and Industrial Strategy (BEIS) of arrangements for electricity system governance and system operation.
Is that a recognition that existing procedures for regulating the sector are inadequate, or that something has changed beyond just the growth of renewable generation?
So soon after the ESO came into being (in April 2019), might we see further major changes to institutional arrangements for planning and operating Britain’s power system?