Exclusive Insight: Visiting one of the Most Advanced Datacenters in the World

der8auer EN · 42-minute read

A visit to the OEDIV data center in Bielefeld showcased its operations, including IBM Power servers, sophisticated cooling systems, and elaborate power-distribution setups, all designed for minimal downtime and optimal performance. The tour highlighted the advanced technology and meticulous infrastructure in place, with an emphasis on energy-efficient practices and safety measures; special thanks were extended to the team for the insightful tour.

Insights

  • OEDIV, Dr. Oetker's IT subsidiary, operates two interconnected data centers in Bielefeld with advanced systems for data transfer, power distribution, cooling, and backup, ensuring high efficiency and minimal downtime.
  • IBM Power servers, such as the Power 10 with its high power draw and strict temperature requirements, offer exceptional memory-management capabilities; even minor maintenance tasks must be performed by specialized IBM technicians, underscoring the complexity of these systems.


Recent questions

  • What unique capabilities do IBM Power 5 servers have?

    IBM Power 5 servers in Bielefeld offer capabilities beyond standard x86 servers from Intel and AMD. They are built for high-performance computing tasks, making them well suited to data centers and businesses that need robust processing power.

  • How does Urdif ensure data redundancy in Bielefeld?

    OEDIV, Dr. Oetker's IT subsidiary, operates two interconnected data centers in Bielefeld. The centers are linked through a ring-bus system, so a damaged connection causes no data loss, ensuring redundancy and continuity of operations.

  • What cooling systems are used in the Bielefeld data center?

    The Bielefeld data center employs several cooling systems to keep server hardware at optimal temperatures: cold-air containment channels, salt-based tanks that soften the incoming water for the cooling loops, and air-circulation coolers with filters that keep dust out of the rooms.

  • How does the Bielefeld data center handle power distribution?

    Power distribution in the Bielefeld data center involves meticulous planning: 10,000-volt feeds laid out to avoid cable crossings, switchgear that steps the voltage down, UPS batteries for backup power, and a diesel generator that activates within seconds during outages, ensuring a reliable, uninterrupted power supply to the servers.

  • What safety measures are in place at the Bielefeld data center?

    The Bielefeld data center prioritizes safety with measures like smoke detection tubes, oxygen reduction systems, and nitrogen tanks for fire suppression. Internal air circulation and sensors ensure even distribution of air particles, while a complex infrastructure minimizes downtime for critical customers, showcasing a commitment to safety and operational excellence.


Summary

00:00

"IBM Power Servers in Bielefeld Data Center"

  • The city of Bielefeld, about 400 kilometers from Berlin, is home to IBM Power 5 servers, known for capabilities beyond standard x86 servers from Intel and AMD.
  • OEDIV, the IT subsidiary of Dr. Oetker, a major German food-processing company, specializes in IBM Power servers and invited a visit to showcase its operations.
  • OEDIV operates two data centers in Bielefeld, interconnected for redundancy through a ring-bus system so that a damaged connection causes no data loss.
  • Data enters the data center through fiber connections, including links to Frankfurt and to the second Bielefeld data center, with transfer speeds of up to 500 gigabits per second.
  • Power distribution in the data center involves 10,000-volt cables, meticulously laid to prevent cable crossings and ensure data synchronization.
  • Incoming water for the cooling system is treated in salt tanks (water softening), which is crucial for the cooling loops' efficiency.
  • Switchgear steps the incoming 10,000 volts down to 400 volts for distribution, with aspirating smoke-detection tubes ensuring safety.
  • The UPS room houses batteries that can power the data center for 20 minutes, with containment in place to catch acid in case of battery failure.
  • A diesel generator provides backup power within seven seconds, maintaining a preheated state for quick activation and a large external tank for extended use.
  • Server hardware, including an IBM Power 9 server with 1408 CPU cores and 64 terabytes of memory, is cooled through a sophisticated system of cold air containment channels for efficient cooling.
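
The ring-bus redundancy described above can be illustrated as a simple connectivity check: in a ring topology, any single link can fail and every node can still reach every other. A minimal sketch in Python (the node names are hypothetical, not OEDIV's actual topology):

```python
from collections import deque

def connected(nodes, links):
    """Return True if every node can reach every other via the given links."""
    if not nodes:
        return True
    adj = {n: set() for n in nodes}
    for a, b in links:
        adj[a].add(b)
        adj[b].add(a)
    # Breadth-first search from an arbitrary start node.
    seen = {nodes[0]}
    queue = deque([nodes[0]])
    while queue:
        cur = queue.popleft()
        for nxt in adj[cur] - seen:
            seen.add(nxt)
            queue.append(nxt)
    return len(seen) == len(nodes)

# Hypothetical ring: the two Bielefeld sites plus two intermediate fiber points.
nodes = ["dc1", "dc2", "pop1", "pop2"]
ring = [("dc1", "pop1"), ("pop1", "dc2"), ("dc2", "pop2"), ("pop2", "dc1")]

# Intact ring, and the ring after any single link failure, stay fully connected.
assert connected(nodes, ring)
for broken in ring:
    assert connected(nodes, [l for l in ring if l != broken])
```

A plain point-to-point link offers no such guarantee: cut it once and the sites are partitioned, which is exactly what the ring is there to prevent.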

15:26

Efficient Cooling and Safety Measures in Data Center

  • The room visited distributes power to the servers; hot exhaust air from the servers is drawn back in to be cooled.
  • Air circulation coolers with filters ensure no dust in the room, with air passing through radiators cooled by cold water.
  • Heavy floor tiles conceal pipes containing cold water for cooling the air inside the racks.
  • The cold water comes from a room where the supply and return temperatures are marked as 14 and 20 degrees Celsius.
  • Cold-water pipes also run to the company headquarters, providing heating in winter and cooling in summer.
  • Heat from the data center is dissipated through a massive radiator with fans, aided by spray cooling in summer.
  • The shadow side of the building is used for cooling due to lower temperatures, with plate heat exchangers and large tanks for temperature regulation.
  • An oxygen-reduction system (OxyReduct) lowers the oxygen level inside the data center to prevent fires, fed from nitrogen tanks.
  • Internal air circulation and sensors ensure even distribution of air particles for safety and fire detection.
  • The complex infrastructure ensures minimal downtime for critical customers, with servers like the IBM Power 10 having multiple nodes and a control unit, costing millions.
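
The 14 °C supply / 20 °C return water loop above implies a simple heat-removal budget via Q = ṁ · c_p · ΔT. A rough sketch (the flow rate is an assumed example value, not a figure from the tour):

```python
# Heat removed by a chilled-water loop: Q = mass_flow * c_p * delta_T.
C_P_WATER = 4186.0  # specific heat of water, J/(kg*K)

def cooling_power_kw(flow_kg_per_s, t_supply_c, t_return_c):
    """Heat carried away by the water loop, in kilowatts."""
    delta_t = t_return_c - t_supply_c
    return flow_kg_per_s * C_P_WATER * delta_t / 1000.0

# Assumed flow of 40 kg/s (about 144 m^3/h) with the 14/20 degC loop from the tour:
print(round(cooling_power_kw(40.0, 14.0, 20.0), 1))  # 1004.6 kW, roughly 1 MW
```

The takeaway is that a modest 6 K temperature difference, at data-center-scale flow rates, moves on the order of a megawatt of heat, which is why the pipes under the floor tiles are so substantial.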

30:54

IBM Power 10 Server: High Power, Redundancy

  • The Power 10 server has high power consumption and temperatures, necessitating unrestricted airflow.
  • Cables connect each CPU in the server, allowing access to memory across CPUs.
  • IBM Power servers require an IBM technician for any maintenance work, even minor tasks like changing a memory module.
  • The server contains four PSUs of 1950 watts each, so it can keep running with two defective units.
  • IBM Power servers can occupy up to 24 height units (24U) in the rack, so a ladder and safety helmet are required when working on taller builds.
  • The server houses four CPUs in a row, directly connected via cables from the back.
  • Memory modules in the server are 256 GB OMI (Open Memory Interface) DIMMs, totaling 16 TB of memory.
  • The server is reassembled, and a technician initiates the startup process.
  • IBM Power servers excel in memory management, with instances having up to 14TB of memory and 400 CPU cores.
  • The data center's power usage effectiveness (PUE) is 1.35, indicating well-optimized energy consumption.
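
The figures in this section can be sanity-checked with a little arithmetic: four 1950 W PSUs tolerating two failures means the remaining two must carry the whole load, and a PUE of 1.35 means total facility power is 1.35 times the IT load. A sketch (the server load and IT load are assumed example values):

```python
# PSU redundancy: capacity that survives the tolerated number of failures.
PSU_WATTS = 1950
NUM_PSUS = 4
TOLERATED_FAILURES = 2

surviving_capacity = (NUM_PSUS - TOLERATED_FAILURES) * PSU_WATTS
print(surviving_capacity)  # 3900 W from the two remaining PSUs

# A hypothetical 3.5 kW server load still fits after two PSU failures.
server_load = 3500
assert server_load <= surviving_capacity

# PUE = total facility power / IT equipment power.
PUE = 1.35
it_load_kw = 1000.0  # assumed IT load, for illustration only
total_facility_kw = PUE * it_load_kw
overhead_kw = total_facility_kw - it_load_kw
print(total_facility_kw, overhead_kw)  # 1350.0 350.0 -> 350 kW of cooling/losses
```

In other words, at a PUE of 1.35 only about 26% of the facility's power goes to anything other than the IT equipment, which is what makes the rating notable.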

46:52

"Grateful Farewell to Tour Guides"

  • Gratitude expressed to OEDIV, Dominic, Chris, Marvin, and everyone involved in the tour
  • Appreciation for the time taken to show around
  • Farewell and thanks for watching the video
