
Review: Lord Cullen - what have we learned from Piper Alpha?

Monday, September 16, 2013

Lord Cullen, who conducted the 13-month government inquiry into the Piper Alpha disaster, used his keynote speech at the 2013 Oil and Gas UK conference to question how much the industry has learned since then. The inquiry made 106 recommendations, all of which were accepted by the industry.

His talk, given to mark 25 years since Piper Alpha, aimed to 'give you some reflections on the Piper inquiry and look at them in the light of recent developments,' he said.

When starting the Piper Alpha inquiry, it was not obvious what direction it should take, Lord Cullen said: “I was asked to make observations and recommendations with a view to preservation of life and avoiding similar accidents in the future.”

There did not seem to be much point in limiting the analysis to the one specific accident, because this would only help prevent that specific accident from recurring. “Major accidents are relatively rare – history does not repeat itself in the same fashion,” he said.

So instead, “I examined the significance of whatever had a tenable connection with the chain of events which led up to the catastrophe, and I also took account of other factors which played no part in bringing about the result, but which were in themselves cause for concern.”

Management
Before starting the inquiry, he expected that it would be concerned with hardware, he said: “To some extent it is. Take subsea isolation valves, which were lacking in Piper Alpha.”

“But I quickly realised that the fundamental issue, running through everything else, was the management of safety.”

“And as I dug down to the background of what happened, I discovered it was not just a matter of technical or human failure. As is often the case, such failures are indicators of underlying weaknesses in management of safety.”

“Management shortcomings emerged in a variety of forms. For example, there was no clear procedure for shift handover. The permit to work system was inadequate, and such as it was, it had been habitually departed from. Training, monitoring and auditing had been poor; the lessons from a previous relevant accident had not been followed through. Evacuation procedures had not been practised adequately.”

“There had not been an adequate assessment of the major hazards and methods for controlling them.”

For example, no-one had fully understood the implications of a high pressure gas fire, which would have consequences for the structure and integrity of the platform, for safety of personnel, and for the means of evacuation and escape. “The gas pipelines would take hours to de-pressurise – and this became a dreadful reality on the night of the disaster,” he said.

“These provided possible starting points for recommendations to the industry and changes to the regulation of safety.”

“I was conscious that no amount of regulations can make up for deficiencies in the quality of the management of safety. That quality depends critically on effective safety leadership at all levels, and the commitment of the whole workforce to give priority to safety.

“I saw those factors as intertwined with each other, and together making a positive learning culture and all that entails in the way of values and practices. It is essential to create a corporate atmosphere or culture where safety is understood to be, and accepted as, the number one priority,” he said.

“Management have to communicate this at all times and at all levels within the organisation – most particularly by their everyday decisions and actions in tackling the issues which arise. These provide the opportunity for subordinates to see real practical substance. Leaders undoubtedly set the tone.”

These ideas were echoed by the board which investigated the loss of Space Shuttle Columbia and its crew in 2003, whose report warned: “If reliability is preached as an organisational bumper sticker but leaders constantly emphasise keeping on schedule and saving money, workers will soon realise what is important and change accordingly. Be thorough and inquisitive, avoid leadership by PowerPoint, and question untested assumptions.”

Safety means “ensuring all the company’s employees and contractors not only know how to perform their job safely but are convinced they have responsibility to do so,” Lord Cullen said.

Safety representatives
Regulations were introduced in 1989 that staff should elect safety representatives and their company should train them. “They are not part of the management but they have important functions, such as the power to carry out investigations and report safety concerns to management, without fear of recrimination,” Lord Cullen said. “It helps reinforce the principle that each employee is responsible for his own safety.”

The definition of the required training was “somewhat cryptically expressed – ‘such functions as may be reasonable in all circumstances.’ The kind of words only a lawyer would use,” Lord Cullen said.
“In practice it seems this means basic training in health and safety, the employer’s health and safety policy, and how safety representatives should carry out their functions.”

Lord Cullen said he strongly supported the development of an OPITO industry standard for training safety representatives, including investigating accidents and the use of audits.

Process Safety
A theme of accidents over the past decade has been too much emphasis on personal safety (hard hats) and not enough on process safety (the failures which cause the big disasters).

“The shortcomings on Piper Alpha represented failures on the part of management to give adequate attention to process safety, where the frequency of incidents is low but the potential consequences are very serious,” he said.

Similarly, the (UK) Buncefield disaster in 2005 showed poor process safety. “An overflow of petrol led to the ignition of a vapour cloud and a massive explosion,” he said. “A monitoring gauge had been sticking for months without an effective response. A high level switch for closing down the flow was inoperable; it had not been locked in a working condition. Bunds for containing fuel were inadequately designed and maintained. A report published in 2011 stated that various pressures had created a culture where keeping the plant running was the primary focus.”

“The safety management system focused too closely on personal safety and lacked any real depth on control of major hazards. There should have been an understanding of major accident risks and systems designed to control them.”

Also in 2005 was the BP Texas City Refinery disaster, with a release of flammable liquid and an explosion and fire. “The US investigators said it was caused by deficiencies at all levels of the corporation. Cost cutting, failure to invest and production pressures had impaired process safety performance,” he said.

“The reliance on a low personal injury rate as a safety indicator had failed to provide a true picture of the health of the safety culture.”

“That disaster led to the setting up of a panel under James Baker III which looked at BP’s US refineries. It said BP ‘emphasised personal safety but not process safety and did not set an appropriate process safety tone at the top.’”

“Five years later came Macondo. Among the many words that have been written on this disaster was a report by the Deepwater Horizon Study Group, by members of the Centre for Catastrophic Risk Management. Its findings were strikingly similar. It said BP’s system was geared to a ‘trip and fall’ mentality rather than being focussed on the big picture. It had been observed that BP forgot to be afraid.”

Auditing and Learning
For a safety system to work, “auditing is essential – and as far as I am concerned it should be inquisitive auditing,” he said.

“On Piper there had been an audit of the permit to work procedure six months before the disaster. No deficiencies had been reported. The management assumed that in the absence of such feedback all was well, but the practice was very different.”

Once signs are spotted, you need to make sure people learn from them. “If you read the report of a major incident you will often see that it was preceded by the neglect of signs that all is not well,” he said.

“Nine months before the Piper Alpha disaster, a rigger had been killed in an accident which was due to members of the night shift improvising in the course of a lifting job without an additional permit to work, and to them not receiving adequate information from the day shift. After that incident management took some steps but they were not followed through.”

“I recall a chief process engineer from Piper saying in the course of his evidence that there were always times when it was a surprise that you found some things were going on.”

“In Buncefield there were signs that the equipment was not fit for purpose but nothing was done apart from temporary fixes.”

“Warning signs in Texas City refinery had been [ignored] for several years.”

Risk control systems give warnings about problems which do not themselves escalate into major incidents.

Communications
Communication is critical in many areas of safety management – from shift handovers to company communications about safety policy.

The James Baker inquiry into the Texas City Refinery said that corporate managers and the refinery workforce should both understand the importance of process safety, and BP’s corporate management must clearly, frequently and consistently communicate that value.

“During its review, the panel found little to indicate that - before March 2005 - BP corporate management had effectively demonstrated its commitment to process safety, either through its communications or through a regular presence at US refineries,” Lord Cullen said.

“Communication is no doubt especially demanding in the offshore industry. It is normal that different workforces have to work together and that they are doing so in isolated and demanding environments.”

“The commission which investigated the blowout at the Montara well head platform in the Timor Sea in 2009 found that a contributing factor had been a systemic failure of communication between the owners and the rig operators – and between rig and onshore personnel of both companies.”

“The rig operators were ultimately responsible for rig safety, but when it came to certain critical procedures it was the owners that were calling the shots. Rig personnel were oblivious to the flaws in decisions taken by the owners, but were going along with them.”

“The commission observed that communications between owners and operators needed to be more formalised, with explicit sign-off on important decisions affecting safety, well integrity and the environment.”

“Eight months later we come to Macondo again, where the National Commission observed [afterwards] that BP and other operators needed an effective system in place for integrating the various corporate cultures, internal procedures and decision making protocols of the many different contractors involved in a deepwater well.”

Making it happen
“Many companies have safety slogans such as absolute safety and zero accidents. Piper was no exception,” he said.

“The Baker Panel said, ‘BP has the aspirational goal – no accident, no harm to people – but it appears that refinery managers have had no guidance from corporate-level refinery management as to how to achieve that goal.’”

Safety Cases
Lord Cullen spoke at length about safety cases. A safety case is a document, or ‘structured argument’, showing that a system is acceptably safe.

Safety cases had been required for onshore management of hazards since regulations introduced after the Flixborough disaster of 1974, an explosion at a chemical plant where 28 people were killed and 36 seriously injured.

However in 1987, the UK’s Department of Energy advised against introducing a safety case regime offshore. This proved to be “a serious setback for development of the offshore regime,” he said.

“Onshore the hazards were serious enough. Offshore they were compounded by the isolation of installations, the concentration of the workforce on or near them, the unpredictability of the weather, and the fact that in the event of an emergency, immediate protection for the workforce had to be provided. The conduct of one set of employees might affect that of others.”

After Piper Alpha, Lord Cullen recommended an offshore safety case regime, which would include identification and control of major hazards, safety management systems, temporary protection for crew in the event of an emergency, and full evacuation and rescue.

“I said it was an important component of the regulatory regime. The safety case should include provision of how safety should be achieved, covering both operators and contractors,” Lord Cullen said.

“The requirement for safety cases is no doubt demanding, for operators and for those who have to discharge a regulatory function. It calls for expertise, vigilance and resources. It means a thorough assessment of risk, asking and answering the ‘what if’ questions.”

Lord Cullen quoted the report by Sir Charles Haddon-Cave on the Nimrod aircraft disaster of 2006, which caused the deaths of 14 RAF personnel. The plane caught fire after a routine mid-air refuelling manoeuvre.

Sir Charles had said that the MOD safety case for Nimrod “was riddled with errors, a story of incompetence and cynicism. It was fatally undermined by a flawed assumption: the task was seen as one of proving something which everyone knew as a fact – that Nimrod was safe. This attitude was corrosive,” Lord Cullen said.

“A company which is competent to operate an offshore installation should be competent to produce a safety case,” Lord Cullen said.

“The involvement of the company’s own personnel [to put together the safety case] is the best way to obtain the full benefit within the company – and for the purpose of dialogue with regulators,” he said. “They should be in a position to contribute to the production, review and revision of the safety case.”

“On the other hand, consultants have a role in bringing an independent perspective and novel techniques.”

“Some people see the preparation of a safety case as little more than a paper exercise,” he said. “To my mind that would be to misunderstand its value.”

“Focussing on safety in a systematic way may reveal gaps in safety protection. It provides a learning opportunity. It can enable senior management to communicate their safety strategy. It can assist the workforce to understand the rationale for systems and practices. It should assist in making improvements.”

“This pre-supposes that those who should have the information from a safety case can find it and understand it.”

“As I understand it, the typical safety case is extensive – and due to the need for it to be technically robust, much of it is complex. That can be a problem.”

“The Maitland panel [which looked into the UK offshore safety regime after Macondo] said safety cases should be living documents central to the way facilities are operated, with contents widely understood.”

“A safety case should reflect the organisation’s safety culture. If that culture is sound and healthy – it should show.”

Author: Karl Jeffrey
Company: Digital Energy Journal

