UK businesses are ill-prepared to cope with disaster and disruption, and many organisations are no more than “sitting ducks”, according to the Chartered Management Institute (CMI).
The body issued a report this week that finds that while most organisations have identified the main threats to their business, they have failed to address them: 86% of businesses know what problems they face, but only one in 10 has done something about it.
The CMI report commented: “External drivers are putting increasing pressure on businesses to adopt continuity plans.” Even where plans are in place, most are not tested adequately, “so there is no confidence in whether they’ll work if they’re invoked”.
Accompanying the report are a number of case studies – two of which are reproduced below.
Big Bang hits brewer
On Saturday 13 May 2000, a huge explosion occurred inside a warehouse at Enschede in the Netherlands. The surrounding district was devastated, and many houses were damaged, some of them beyond repair. The production unit and administration building of the Grolsch Brewery stood 300 metres from the explosion. Part of the production unit was completely destroyed, and the administration building, which housed the computer area, was severely damaged.
Straight after the explosion, the police sealed off the disaster area. Grolsch’s computer equipment kept running, and an automatic backup completed just one hour before all electricity in the area was cut off.
The wall protecting the computer area stood at right angles to the direction of the blast wave, and so caught the main impact of the explosion. Thanks to this, the computer area and the backup tapes survived largely intact. The authorities gave permission for the backup tapes to be retrieved, and the company’s fire brigade collected them the next day.
The entire site at Enschede was out of use, and the area was sealed off. This meant the site’s recovery centre, which was integral to the company’s disaster plan, couldn’t be used.
By 9am on the Sunday, Grolsch’s crisis managers had gathered in a restaurant. This quickly became a genuine crisis centre, from which all actions to get the company’s business operational again were co-ordinated. It was immediately decided to transfer business activities to Grolsch’s second site at Groenlo, 30km from Enschede, which the disaster recovery plan designated as the alternative recovery centre.
A number of mobile offices were installed in the car park of the Grolsch site, adding 125 working positions. The mobile unit that had served as a recovery area for the first few hours was emptied, and all the backup equipment was installed in a room inside the building.
Grolsch needed to return to its site in Enschede as soon as possible, but it became clear that this would take up to eight weeks. At the time, Grolsch had two production units, one in Groenlo and one in Enschede, responsible for one third and two thirds of the company’s entire production capacity respectively.
At the time of the disaster, Enschede’s stock contained four days’ worth of production supplies. Although the supplies were undamaged, they could not be collected for several days. To compensate for the lost production capacity, Grolsch made arrangements with some of its counterpart breweries, including Heineken, Bavaria and Interbrew.
When the phones crashed
Skipton Building Society’s principal office, on a hill above the idyllic town of Skipton, houses its call centre, teleservices and financial services departments, as well as four of its subsidiaries, including Savings Management Limited, which operates a call centre on behalf of a third-party company.
All in all, five separate national call centres operate from the site, handling between 2,000 and 5,000 calls on any given day.
For the telephone system to become inoperative, even for a day, would be a catastrophe. Yet this is exactly what happened: after recovering from a fault the previous evening, the system went down again just before 9am, as staff were arriving.
The telephones had been intermittent the day before the incident, and a BT engineer had been called out as a precautionary measure, but found all the telephones working. Within 40 minutes of the engineer leaving, the telephones went down again and stayed down until 23:00; the returning engineer had them fixed by 23:15, and they worked again until just before 9:00 the next morning.
For anything that might go wrong in relation to business continuity and disaster recovery, Skipton Building Society operates a “major incident procedure” plan in association with the local BT Service Centre in Leeds.
Network services manager Anthony Halsall described this as: “An escalation path, to ensure that incidents are given the appropriate level of attention. Up to five pre-defined scenarios are listed, each one being used as a descriptor of the level of importance of the incident, number one being the most urgent.”
The engineer was on site again at 11:10, but by 12:30 Skipton’s network services team decided it was time to invoke the disaster recovery plan and called BT CommSure. At Skipton’s request, BT also set up a recorded customer re-direction message, so that incoming callers would not encounter dead lines.
Once it had been decided where and how to lay the cables, the next step was to consult the building services department about the security implications of leaving doors and windows open to accommodate back-up cables. The easiest and most direct route to connect a BT CommSure mobile PBX unit to the central switch box was through a vent, along a corridor and via a window. The Skipton network services team worked alongside CommSure to assemble and lay some of the emergency cabling.
To re-wire the telephone system, CommSure installed temporary connections onto the existing test-jack frame, using ‘combs’ to re-direct lines away from the faulty PBX. Such combs provide a rapid and effective way of isolating the temporary PBX from the permanent one. The back-up system provided by BT catered for six times the capacity of the faulty system, said Halsall.
After the telephones were back up and running, Halsall noted: “The backup system was kept here on standby for a few days, just in case, and did prove that we can recover from a major disaster within 24 hours.”
In this particular instance, Skipton Building Society did not have a plan specifying which phones should be connected while capacity was limited: the existing plans were based on a total loss of the site, not just of the PBX switches. The team therefore had to invent one on the fly, and network services managers found themselves dashing around the buildings asking which members of staff required priority connections.
A situation that could have resulted in operational disaster, with loss of sales and strains on customer and staff relations, instead became a learning experience for Skipton. Business continuity plans do not always have to be invoked as the result of extreme circumstances. In this incident, no staff were injured and the company was given the opportunity to test the efficiency of its systems. The company also found that morale improved as a result of overcoming the situation through a team effort.
The BT CommSure installation left behind an enormous drum of cable, which enabled Skipton network services to create more than 200 spare connections that now act as a permanent back-up system. These sit in a separate frame, fastened to a PBX and ‘ready to go’; and hydra-like alternative switch-box locations exist throughout the buildings, so that if one goes down, another can be used.
The incident helped the Disaster Planning Committee at Skipton to formalise procedures and increase its level of cover. Emergency planning reviews found that, of every 100 phones, only 92 were actually necessary. In a building housing more than 1,000 telephones, the potential cost savings from efficient use of resources were significant. The next step for Skipton is relocation planning.