Global Switch, the firm which took over the 23,320m² building in 1998, provides internet service providers and data communications companies with secure premises for housing their hardware. Its London Switch centre provides numerous tenants with round-the-clock, fully resilient facilities and support.
The building had all the right credentials, notably its location: close to the fibre optic backbone and to power sources adequate for the communications and air conditioning equipment. Its inherent structural characteristics, with floor loadings of around 10kN/m² and floor-to-ceiling heights of around 4m, made it even more appropriate.
Work on the refurbishment began in November 1998, with official completion in October 1999. RW Gregory and Partners was appointed as M&E consulting engineer.
The 24-hours-a-day, 365-days-a-year nature of the operation means that standby resilience on all the plant is key. Global Switch must guarantee that client areas will not be without power for more than 15 minutes. The size and power densities of the tenant areas vary, but all are supplied as integral fire compartments, with power, data and a chilled water supply for clients to fit out as required.
However, due to the demand for space, it was decided to increase the building's W/m² capacity. This resulted in the need for an extra transformer, two more generators and two additional chillers to handle the increased cooling load.
Building structure and layout
The distinctive three-storey building is clad in grey steel, with 12mm-thick full-height glazing along its two longest sides. The tenant spaces occupy all three floors and total over 20,000m², with the ground floor being home to the offices, reception and cafeteria.
The main generator plantroom is situated behind the ground-level loading bay. This was another feature retained from its days as a printworks, and provides a means for tenants to bring equipment into the building. It also houses the bulk fuel storage and the two additional generators. The generators and the fuel store are contained in separate four-hour fire enclosures – a stipulation of the local authorities.
Directly above the generator plantroom is the high voltage and low voltage switch room. Above this on the second floor is the boiler room which also houses the air handling units. The chillers and chilled water plantroom are located on the roof.
Four metal-clad staircase towers are situated along the building's south side. Two at the centre make up the entrance which also incorporates the security pod, while the remaining two at either end act as fire escapes. There are also two fully pressurised fire-fighting staircases in the building.
The services engineering
The power consumption of the data equipment and the year-round refrigeration are the two biggest loadings. Initial consideration was given to letting tenants fit their own direct expansion or variable air volume systems, but it was found that the majority of potential tenants wanted a chilled water supply to connect to their own floor-standing air conditioning units.
Five air-cooled chillers have been installed in two areas on the roof, four to handle the load and one on standby. These provide a total of 4.9MW of cooling and deliver water at 6°C, with a return of 12°C, to each of the tenant areas. Design conditions for the switch areas are for a year-round temperature of 23°C.
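As a rough indication of what this duty implies for the chilled water circuit, the sketch below estimates the design flow rate from the 4.9MW capacity and the 6°C/12°C flow and return temperatures; the water properties used are standard assumed values rather than figures from the building study.

```python
# Rough chilled water flow estimate for the stated 4.9 MW duty (illustrative only).
# The 6 degC flow / 12 degC return temperatures are from the article; the specific
# heat and density are generic water properties, assumed rather than design data.

cooling_duty_kw = 4900.0      # total installed cooling, kW (4.9 MW)
delta_t_k = 12.0 - 6.0        # temperature rise across the load, K
cp_kj_per_kg_k = 4.19         # specific heat of water, kJ/(kg.K) - assumption
density_kg_per_m3 = 1000.0    # water density, kg/m3 - assumption

mass_flow_kg_s = cooling_duty_kw / (cp_kj_per_kg_k * delta_t_k)
volume_flow_l_s = mass_flow_kg_s / density_kg_per_m3 * 1000.0

print(f"Mass flow ~ {mass_flow_kg_s:.0f} kg/s")      # ~195 kg/s
print(f"Volume flow ~ {volume_flow_l_s:.0f} l/s")    # ~195 l/s
```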
A 600mm raised floor, which was primarily installed to accommodate the data cabling, has designated routes for the chilled water pipes. These are bunded with two courses of brickwork and a waterproof liner to contain any water in the event of a leak.
Tenants decide the layout of their spaces depending on their particular requirements. A typical arrangement is to have the data racks laid out perpendicular to the external wall, with floor-standing vertical close control air conditioning units running parallel to the wall. These discharge into the floor void and exit through floor distribution grilles situated between the racks. The air then returns to the unit via extract grilles in the plasterboard ceiling. A redundancy of one in 10 is used for the air conditioning units installed in the switch areas.
Solar gains from the fully-glazed façade were not considered to be significant – the north-facing glazing has a U-value of 5.6W/m²K. Even less significant are the gains from occupants: there are generally fewer than 100 people in the building, with only transient occupancy in the tenant areas.
The heating load is almost non-existent and is mainly confined to background heating for the landlord areas. The three existing Hoval boilers have been refurbished; these are capable of providing almost double the 0.8MW heating capacity required for the building. They are used to heat the fresh air for the reception, canteen, corridors and offices. Radiant panels are used for heating in the loading bay.
Power is via two 11kV incomers from separate London Electricity sub-stations. Either of the incomers is capable of handling the 4.5MW load of the building, which is split almost 50:50 between the power needed to run the data equipment and that for running the air conditioning systems.
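To give a feel for what that load means at the incoming voltage, the sketch below estimates the three-phase current each incomer would carry at the full 4.5MW; the 0.9 power factor is an assumption, not a figure quoted in the study.

```python
# Rough three-phase current for the 4.5 MW building load at 11 kV.
# Load and voltage are from the article; the power factor is assumed.
import math

load_w = 4.5e6              # building load, W
line_voltage_v = 11e3       # incoming supply, V (line-to-line)
assumed_power_factor = 0.9  # assumption for the example

current_a = load_w / (math.sqrt(3) * line_voltage_v * assumed_power_factor)
print(f"Current per incomer at full load: ~{current_a:.0f} A")  # ~262 A
```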
Four 11kV/400V transformers have been installed to handle the electrical load. Tenants install their own switchgear from the three-phase distribution panel and provide their own UPS capable of giving 30 minutes' continuity of supply in the event of a power failure.
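To illustrate what the 30-minute UPS requirement might mean for a tenant, the sketch below sizes the stored energy for a hypothetical suite; the 200kW IT load, inverter efficiency and ageing margin are illustrative assumptions, with only the 30-minute figure coming from the building study.

```python
# Illustrative UPS energy sizing for the 30-minute autonomy requirement.
# The tenant load, inverter efficiency and battery margin are assumptions
# chosen for the example; only the 30-minute figure is from the article.

tenant_it_load_kw = 200.0      # hypothetical tenant IT load, kW
autonomy_hours = 0.5           # 30 minutes of continuity required
inverter_efficiency = 0.92     # assumed UPS inverter efficiency
ageing_margin = 1.25           # assumed allowance for battery ageing/derating

energy_at_load_kwh = tenant_it_load_kw * autonomy_hours
battery_energy_kwh = energy_at_load_kwh / inverter_efficiency * ageing_margin

print(f"Energy delivered to load: {energy_at_load_kwh:.0f} kWh")   # 100 kWh
print(f"Installed battery energy: ~{battery_energy_kwh:.0f} kWh")  # ~136 kWh
```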
Five 1.25MVA Broadcrown diesel generators have been installed for back-up in the event of a power failure; a sixth unit is also in place should one of these fail. The generators can be up and running within four minutes and are capable of running for three days on the diesel fuel store. The flues for the generators pass up through the building and terminate at roof level.
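As a sanity check on the standby capacity, the sketch below compares the firm generator output (five duty sets, with the sixth held in reserve) against the 4.5MW building load; the 0.8 power factor used to convert MVA to MW is an assumption rather than a quoted figure.

```python
# Standby generation check: five duty sets plus one redundant unit (N+1).
# Unit rating (1.25 MVA) and building load (4.5 MW) are from the article;
# the 0.8 power factor is an assumed value for the MVA-to-MW conversion.

unit_rating_mva = 1.25
duty_units = 5                 # sixth set is held as the redundant unit
assumed_power_factor = 0.8
building_load_mw = 4.5

firm_capacity_mw = duty_units * unit_rating_mva * assumed_power_factor

print(f"Firm standby capacity: {firm_capacity_mw:.1f} MW")   # 5.0 MW
print(f"Building load:         {building_load_mw:.1f} MW")
print("Adequate" if firm_capacity_mw >= building_load_mw else "Shortfall")
```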
New risers were installed throughout the building for the core services including smoke extract and pressurisation. Data cabling enters the building through a dedicated point of entry and is distributed throughout the ground floor and the first and second floors via risers.
What made this building suitable?
The London Switch centre is probably a fairly typical example of the building services challenges posed by telehotels. In a telehotel, space is sold to tenants on the strength of its high-specification infrastructure, reliability and round-the-clock support, as well as a high degree of redundancy. Interrupted service is unacceptable.
The emphasis is on business needs. Little consideration is required for any 'human factor'.
Advances in the technology of the data equipment, for example the increased capacity of fibre optics, will probably be the major factor influencing the design of these buildings. Indeed, between inception and construction it is common for plant and services to be oversized by as much as 15 per cent to satisfy future requirements.
Space efficiency can be drastically reduced when rooms aren't quite the size required, and structural characteristics can compromise the desired layout. Less than generous floor-to-ceiling heights, a lack of space for plant and services, other building tenants to consider and compromised security are common problems.
All this makes the former Financial Times printworks building an inspired choice for refurbishment into a telehouse. It was fortuitously close to the fibre optic network but, importantly, it could also carry substantial floor loads and provide generous floor-to-ceiling heights.
The building's innovative structure also played a part. The free-standing glass façade, supported by a network of external steel braces, allowed extra floors to be installed where necessary. This was helped by the large open areas, which were relatively uncluttered by structural columns and risers.
Data centre tips
Server farms are springing up everywhere. We asked Peter Walker of consultancy LAL about the implications for the installation of air conditioning.

What is the difference between cooling an office and a data centre?
There are essentially two types of air conditioning system today – one for cooling machine environments, and the other for cooling people. Cooling data centres involves the use of close control air conditioning systems.

What are the typical machine-cooled environments?
Machine environments include data centres, web server farms (for internet, e-business and web transaction applications), ISP computer rooms and centres for telecom switchgear and computer network management apparatus. Organisations setting up such centres include facilities management companies, blue chips, large telecoms companies and financial conglomerates.

What are the temperature requirements?
Temperature needs to be kept at 21°C plus or minus 2°C, while relative humidity needs to be 50 per cent plus or minus 5. Machines are sensitive to both and cannot tolerate rapid rates of change.

Costs?
Typically £250-£350/m². Essential factors include cooling, humidity control and air quality, i.e. dust filtration. Bear in mind that most of the energy goes into the servers and is then converted into heat.

How reliable will my centre be?
All such facilities have to offer a minimum of 'six nines' – 99.9999 per cent – reliability (a rough conversion of this figure into annual downtime is sketched after this Q&A). The data centre must function with such reliability because it is carrying, switching or storing the electronic lifeblood of its customers' business.

What is the average life cycle of data centre air conditioning systems?
No two data centres are alike, but around 15-20 years, as long as optimal maintenance policies are pursued. Regular monthly visits are normally made to clean and check the filters in the units. The installations run continuously, 24 hours a day, which puts more stress on all the equipment.

Peter Walker is the managing director of LAL, which specialises in the design, supply and installation of air conditioning systems for data communications centres and ISPs.
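For context, the sketch below converts the 'six nines' availability target quoted above into the downtime it permits over a year; the arithmetic is generic rather than specific to this building.

```python
# Downtime allowed by a 'six nines' (99.9999 per cent) availability target.

availability = 0.999999
seconds_per_year = 365 * 24 * 3600          # non-leap year

allowed_downtime_s = (1.0 - availability) * seconds_per_year

print(f"Allowed downtime: ~{allowed_downtime_s:.0f} seconds per year")  # ~32 s
```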
Source
The Facilities Business
Postscript
This article is an extract from a building study published in Building Services Journal, February 2001.