
Subtitles

  • Welcome to a video tour of Google's first container-based data center.

  • There are slots for over 45,000 servers in the 45 containers housed inside. The data center itself went into service in 2005 and supports 10 MW of IT equipment load. It has a trailing twelve-month average Power Usage Effectiveness (PUE) value of 1.25.
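
As a rough illustration of what that PUE figure implies, the sketch below combines the two numbers quoted in the narration; the resulting total facility load is an inferred value, not one stated in the video.

    # PUE = total facility power / IT equipment power, so the figures above
    # imply roughly the following total draw (illustrative only).
    it_load_mw = 10.0        # IT equipment load stated in the tour
    pue = 1.25               # trailing twelve-month average PUE
    total_facility_mw = it_load_mw * pue              # about 12.5 MW
    overhead_mw = total_facility_mw - it_load_mw      # about 2.5 MW for cooling, distribution, etc.
    print(f"total facility load: about {total_facility_mw:.1f} MW ({overhead_mw:.1f} MW overhead)")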

  • We start with the plan view and focus in on the equipment yard we will be visiting.

  • Here, we see the cooling towers, the power distribution centers, and the generator farm.

  • Our first "action" shot shows water flowing down the cooling tower fill.

  • The plant itself is designed to maximize water-side economization through a combination of elevated process water temperatures, low-approach-temperature plate and frame heat exchangers, and a chiller bypass path.
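
A minimal sketch of the economization condition those three measures work toward, using purely hypothetical temperatures (none of these numbers appear in the video): free cooling through the plate and frame heat exchangers is possible whenever the tower water, after both approach temperatures are added, is still at or below the process water setpoint.

    # Hypothetical numbers for illustration only.
    wet_bulb_c = 15.0                 # outdoor wet-bulb temperature (assumed)
    tower_approach_c = 4.0            # cooling tower approach (assumed)
    hx_approach_c = 2.0               # plate-and-frame heat exchanger approach (assumed)
    process_supply_setpoint_c = 24.0  # elevated process water temperature (assumed)

    coldest_process_supply_c = wet_bulb_c + tower_approach_c + hx_approach_c
    chiller_bypass_possible = coldest_process_supply_c <= process_supply_setpoint_c
    print("chiller bypass possible:", chiller_bypass_possible)

Raising the setpoint or lowering either approach widens the range of outdoor conditions under which the condition holds, which is why all three measures appear together here.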

  • As we swing over to the electrical yard, we see a medium-to-low voltage distribution center.

  • The transformer itself is not exotic, but it boasts better than 99.5% efficiency, as evidenced by the relatively small cooling radiators attached to it.
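
To put that efficiency figure in perspective, here is a back-of-the-envelope loss estimate; the 2.5 MW loading is an assumed, illustrative number, not one given in the video.

    # Losses in a >99.5%-efficient transformer at an assumed 2.5 MW load.
    load_kw = 2500.0
    efficiency = 0.995
    loss_kw = load_kw * (1.0 - efficiency)   # roughly 12.5 kW dissipated as heat
    print(f"heat the radiators must reject: roughly {loss_kw:.1f} kW")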

  • As part of the distribution center, the transfer switches serve to connect the generators seen here.

  • Eventually, the output of the distribution center is run into the building through these low voltage cables.

  • Plant tour - Here we begin the plant tour. Stepping inside the building, we again pick up the low voltage distribution lines.

  • Cable routing takes place in these standard trays that eventually head off for connection to the container switch panels.

  • We now step into the cooling plant.

  • As we pull back, we get an overall view of the web of piping required to connect all the equipment together.

  • As the water returns from the cooling towers, a portion of it is taken off and directed through a side-stream filtration system before being returned to the loop.

  • Continuing on, we trace the piping down to the array of pumps, where we see that we experience our share of leaks :)

  • Of course, when we talk about cooling plant efficiency, the chiller uses the most power and is therefore our biggest target.

  • An important part of reducing chiller hours is the use of plate and frame heat exchangers with their low approach temperature characteristics.

  • We conclude the tour of the plant with a shot of the thermal storage tanks and the distribution piping that carries the water to the containers.

  • Hangar tour - The hangar tour starts with a shot of the bridge crane used to move the containers.

  • As we pull back, we get a view of the single-story side, which is more clearly seen from a vantage point on the mezzanine.

  • There are 15 containers on the single-story side and 30 on the opposing two stories.

  • Amongst the safety considerations in using containers, we include emergency exit stairs and egress pathways, as required by code, at the tail end of the units.

  • This final view gives an idea of the scale of the hangar bay.

  • In this segment, we follow one of our technicians to the container, where he is dispatched to perform a server replacement.

  • Here we see him making his way on his Google-provided "personal transportation device."

  • He must first verify the machine location.

  • As you see here, each tray is outfitted with an onboard UPS running at greater than 99.5% efficiency.

  • Now, once inside, our technician removes the tray blank and installs the repaired server tray.

  • The cold aisle inside the container is maintained at approximately 27°C, or 81°F, enabled by attention to good air flow management.
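
A quick conversion check on the stated setpoint (27°C works out to about 80.6°F, which the narration rounds to 81°F):

    cold_aisle_c = 27.0
    cold_aisle_f = cold_aisle_c * 9.0 / 5.0 + 32.0   # 80.6 F
    print(f"{cold_aisle_c:.0f} C is {cold_aisle_f:.1f} F")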

  • Here, we show our idea of Google nightlife :)

  • Another technician is entering the container to inspect the underfloor area.

  • As he lifts the floor grate, we get a view of the flex hoses, power conduit, and fans used for cooling.

  • In fact, there are many more system details designed into the container, as shown in this isometric cutaway and this cross-section view detailing the tightly controlled air flow path.

  • As we step out once more into the vestibule area, we see the hookups bringing water into the container.

  • On the other side, we find the electrical switch panel.

  • And inside the panel, we trace the cable inputs into the breaker section and get to see some of the circuit transformers used in gathering our PUE measurements.
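
A hypothetical sketch of how readings from metering points like these could be combined into a PUE figure; the metering layout and all numbers below are assumptions for illustration, not details given in the video.

    # IT power taken from panel-level circuit transformer readings (assumed values),
    # total facility power taken from a utility-side meter (assumed value).
    it_panel_kw = [480.0, 495.0, 510.0]
    facility_total_kw = 1840.0
    pue = facility_total_kw / sum(it_panel_kw)
    print(f"instantaneous PUE: {pue:.2f}")   # about 1.24 with these assumed numbers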

  • All in all, there are many additional support systems required to satisfy the safety and occupancy requirements of the container, including emergency off buttons, fresh air supply, fire detection, fire suppression, smoke dampers, and much, much more.

  • We end our video tour with another view of the hangar and a summary of the best practices incorporated into this data center.

  • Recapping, on the electrical side we saw the use of high-efficiency transformers and UPSes, along with the necessary circuit transformers for accurate PUE measurements.

  • On the cooling side, we touched on many aspects designed to allow raising of the temperature inside the data center and the associated reduction in chiller usage hours.

  • One last thing to note: the best practices we've presented here are the main reason we've been able to achieve our PUE results, and they can be implemented in most data centers today.

  • Thanks for watching.
