
Data Center
Educational Article

Rethinking Data Center Design to Maximize Efficiency

NEW WHITEPAPER explains why efficiency-optimized data center design and operation have never been more critical

An environmentally friendly data center is always a cost-effective data center.

Simplex Modular Cleanrooms & Separation Systems are now part of Subzero Engineering’s industry-leading product line for mission-critical environments. “Simplex” will become the branded name of Subzero’s Modular Cleanroom & Separation System line of products.

This White Paper will answer these questions:

  • Why are both large hyperscale and smaller data centers moving towards this layout? 
  • What makes this design simpler and more versatile? 
  • How does design affect airflow and cooling optimization?
  • What makes it energy efficient and sustainable?
  • What about equipment reliability?

And provide insight into the following topics:

  • Raised Floor Versus Slab Floor
  • When To Use Raised Floors
  • When To Use Slab Floors
  • Slab Floors Simplify Cooling IT Equipment
  • Containment Options
  • Maximizing Energy Efficiency & Sustainability

Cleanrooms
Press Release

A European modular cleanroom launch is on the horizon

Featured article: Interview with Simplex Isolation Systems

Simplex interview is featured in CleanroomTechnology.com

With a European modular cleanroom launch on the horizon, US-based Simplex Isolation Systems’ Director of Sales talks about trends in the cleanroom design & build sector and his hopes and plans for the company to respond to them.

Cleanrooms
Educational Article

What You Need to Know When Considering a Cleanroom

There are numerous considerations when evaluating a cleanroom.

Does your business plan include the development of an area in your plant for clean manufacturing? Are you concerned that you make the right purchase? Do you want to make sure you consider the right factors when you evaluate different models?

There are numerous considerations when evaluating a cleanroom. This article covers the basics.
You may very well need to employ the services of a cleanroom consultant to help you. Here are a few things to keep in mind as you start to consider cleanrooms and controlled environments.

1. What’s the application? 

Better quality or better yield is the primary reason for investing in a cleanroom space. It goes straight to your bottom line. 

Numerous manufacturing procedures now require a controlled environment in which you limit the amount of dust and dirt in the area of the manufacturing. Medical instrument manufacturing and packaging, electronics and computer manufacturing, food preparation and some military applications are but a few of the instances that have strict requirements for maintaining a clean environment. You need to know the requirement for your specific product or process. If the product you are manufacturing is regulated by a government agency, or you are contracting with a private firm that requires a certain level of clean manufacturing, they should have the cleanroom standards already documented. Check with them first. 

There are different levels of cleanrooms. ISO, the International Organization for Standardization, ranks cleanrooms from ISO Class 1 (the cleanest) through ISO Class 9; the lower the ISO rating, the cleaner the environment. Contamination is measured in particles per cubic meter. An ISO Class 6 cleanroom, for example, is rated at 35,200 particles per cubic meter: the room can have no more than 35,200 particles greater than 0.5 micron in size per cubic meter. These are particles that are not visible to the human eye. (As a comparison, a particle of cigarette smoke is between 0.5 and 2 microns in size; the end of a human hair is about 60 to 100 microns.)

Particle counts are performed at work surface height. Pre-filters remove the dirt and dust you can see (call them baseballs and boulders); HEPA filters capture the particles you can’t see with the naked eye. A light manufacturing area (defined as an environment that is not generating smoke or oil mist, such as storm window assembly and packaging) with pre-filtration on an HVAC system might be equivalent to an ISO Class 8 room, with 3,520,000 particles per cubic meter measuring greater than 0.5 micron. This is comparable to ordinary room air.
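
For readers who want to check these figures: the limits quoted here follow from the ISO 14644-1 concentration formula, Cn = 10^N × (0.1/D)^2.08 particles per cubic meter, where N is the ISO class and D is the particle size in microns. Here is a minimal Python sketch reproducing the Class 6 and Class 8 numbers (ISO rounds published limits to three significant figures):

```python
# Maximum particle concentration per ISO 14644-1:
#   C_n = 10^N * (0.1 / D)^2.08   (particles per cubic meter)
# where N is the ISO class and D is the particle size in microns.

def iso_class_limit(iso_class: int, particle_size_um: float = 0.5) -> float:
    """Max particles/m^3 at or above particle_size_um for a given ISO class."""
    return 10 ** iso_class * (0.1 / particle_size_um) ** 2.08

print(round(iso_class_limit(6)))  # ~35,200 particles/m^3 (ISO Class 6)
print(round(iso_class_limit(8)))  # ~3,520,000 particles/m^3 (ISO Class 8)
```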

Again, know what the manufacturing requirements are. 

2. Know the basic principles behind how a cleanroom works

The majority of cleanrooms, easily more than 90%, are positive-pressure rooms—designed to keep contaminants from entering the room. Air is introduced into the cleanroom, typically at the ceiling level, after passing through a fan-powered HEPA filter that removes particles as small as .5 microns. This creates a pressurized room in which the air pressure in the room is greater than outside the room— hence, positive pressure. The air, and the contaminants in the air, are then pushed down towards the floors, and ultimately pushed out vents in the lower portions of the walls of the room. 

This means that air, and contaminants from the processes in the room, are constantly flowing out of the room. In addition, the air exiting the room, whether through vents or open doors, is at a pressure sufficient to prevent contaminants from entering through those openings.

Negative-pressure rooms are designed to keep contaminants from leaving the room. A negative pressure room is used in instances of infectious diseases and pathogens, bio-contaminants and some hazardous processes using chemicals, flammables and potentially explosive liquids and powders. Your concern is not what gets into the room, but what gets out. 

In a negative pressure room, air is pulled out of the enclosure through reversed HEPA filters, creating a negative pressure inside the room (which prevents contaminants from leaving the room), while air is constantly being drawn in through venting and other openings. The force of the air entering the room prevents contaminants from escaping. 

Because they are so prevalent, for purposes of this white paper we will be discussing positive-pressure rooms. 

3. ISO standards are the industry norm for rating cleanrooms. 

ISO standards were adopted by the industry in 2001. If you do any serious research into ISO standards, you are likely to come across the Federal Standard 209E for cleanrooms, which was the industry norm until ISO standards were developed. The federal standards were officially cancelled by the US Department of Commerce in November 2001, but they are still widely referenced. 

“ISO classifications expanded the horizon for classifying clean space,” according to Richard Matthews of Filtration Technology, Inc., in Greensboro, NC, who chaired the ISO board that developed these standards. Here are the Federal standards and their ISO equivalents. Note that ISO created three new levels that the Federal standard did not address. 

  • ISO Class 9 = No comparable Federal standard
  • ISO Class 8 = Federal Standard Class 100,000
  • ISO Class 7 = Federal Standard Class 10,000
  • ISO Class 6 = Federal Standard Class 1,000
  • ISO Class 5 = Federal Standard Class 100
  • ISO Class 4 = Federal Standard Class 10
  • ISO Class 3 = Federal Standard Class 1
  • ISO Class 2 = No comparable Federal standard
  • ISO Class 1 = No comparable Federal standard

Whenever possible, refer to the ISO standards because they are internationally accepted. If you are dealing with partners in other countries, this will make issues much simpler. 

4. It’s all about air changes per hour, sometimes

With some exceptions, a cleanroom is a cleanroom is a cleanroom. Achieving a cleaner class of cleanroom is all about airflow. It’s a matter of bringing clean air in through HEPA filters in the ceiling and moving contaminated air out through vents in the walls or floors. The greater the number of HEPA filters and vents, the greater the rate of air change. The requirements break down as follows:

ISO Class 9 through ISO Class 6 rooms are rated by air changes per hour; ISO Class 5 through ISO Class 1 rooms are rated by the flow of air through the room in meters per second. How are you getting the air into the space, and how are you pushing it out? It’s easy to clean the air. The more difficult question is: are you moving the air out of the area properly? Where does air enter the clean space, and how efficiently is it moved out, carrying with it the contaminants from the manufacturing process? Placement of work tables, chairs and equipment becomes crucial. An incorrectly placed item creates “dead space” where particles are trapped.
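
As a rough illustration of the arithmetic involved (the room dimensions and per-filter airflow below are assumptions for the example, not recommendations), air changes per hour follow directly from total filtered airflow and room volume:

```python
# Sketch: estimating air changes per hour (ACH).
# ACH = (total airflow in CFM * 60 min/hr) / room volume in cubic feet.
# All figures below are illustrative assumptions.

room_l_ft, room_w_ft, room_h_ft = 20.0, 15.0, 8.0  # hypothetical room
ffu_count = 6                # fan-powered HEPA filter units in the ceiling
cfm_per_ffu = 650.0          # assumed airflow per 2' x 4' unit

room_volume_ft3 = room_l_ft * room_w_ft * room_h_ft  # 2,400 ft^3
total_cfm = ffu_count * cfm_per_ffu                   # 3,900 CFM

ach = total_cfm * 60.0 / room_volume_ft3
print(f"Estimated air changes per hour: {ach:.0f}")   # ~98 ACH
```

Adding or removing filter units, or changing the ceiling height, moves this number directly, which is one reason layout decisions matter so much.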

These environments are prevalent in micro-electronics and will play a big part in nanotechnology as we get more involved in that industry. At these levels it is very important that you consult with a knowledgeable expert. 

5. Every cleanroom has three different states or conditions

Cleanrooms are measured for particulate count at three different levels: As Built, At Rest and Operational. As Built refers to the cleanroom as constructed, empty of any equipment, materials or workers; cleanrooms are typically certified by the manufacturer for their level of cleanliness at the As Built level. At Rest refers to the cleanroom once equipment, machinery, furniture and product have been moved in, but before workers enter. All of these elements bring with them sources of contamination, so you can expect a change in particulate count. Finally, there is the Operational level: the cleanroom with all equipment and materials in place and workers performing tasks. Again, you are likely to see a change in particulate count. Machinery that performs perfectly well outside a cleanroom can be a source of pollution inside one. Metal shavings, gases and oil mist from pneumatic machinery, and vapors from outgassing plastics are all sources of contamination.

Any process that involves friction or movement is going to cause particulation. In order to prevent a drastic increase in contamination at the operational level it is important to adhere to proper protocol and procedures—things like gowning, house cleaning and workflow. If your process must involve machinery, then it may be necessary to further isolate that machine within the cleanroom in order to prevent it from contaminating the rest of the operation. A common solution is to build a cleanroom within the cleanroom using either partitions, or more commonly curtains, to isolate the machine. It may also be necessary to vent that area separately from the rest of the room. 

This change in particulate count across the As Built, At Rest and Operational levels is another reason to involve a cleanroom expert early in the process.

6. Modularity 

Things change. You can count on that. Your need for clean manufacturing space will increase, or expansion will dictate that you move to larger facilities. A cleanroom with a modular design allows the original layout to be expanded without rebuilding from scratch, so you can grow your cleanroom as your needs increase without tossing out part or all of your original investment. And in the event you move to a new facility, you can disassemble your modular cleanroom and take it with you.

There are also cleanroom designs that incorporate casters so that the enclosure can be easily moved around your factory floor. An example where this might be applicable is an injection mold facility where the manufacturing area is ISO Class 9, but production has an order for an I.V. component that requires an ISO Class 7 environment (an I.V. system has to be manufactured in at least an ISO Class 7 cleanroom). You can create this environment on the factory floor by enclosing the injection-mold machine in a portable cleanroom outfitted with casters and a HEPA unit installed in the ceiling grid. Simply roll the enclosure to the appropriate machine and attach softwall curtains to contain it. 

Another advantage to a modular cleanroom is in its tax benefits. A modular cleanroom can be written off in seven years. An existing room within your facility that is transformed into a cleanroom—referred to as “stick-built”—has to be written off over a much longer period of time. 

7. Envision future plans

Don’t make the mistake of trying to get by with a minimum of cleanroom space. You will be surprised at how quickly your cleanroom needs grow. Better to plan for too much space than not enough.

8. Don’t underestimate air conditioning needs

You might start with three workers in your cleanroom, then find you need five or six. Add all that extra body heat, plus any heat-producing machinery in your clean area, and the cleanroom quickly becomes hot and uncomfortable. Also take into consideration that basic cleanroom clothing includes a hair covering, booties and a smock; anything cleaner than Class 7 requires additional safeguards such as masks, beard covers and goggles. It is better to err on the side of too much when planning for air conditioning.

This is where you get into the difference between single-pass and recirculating rooms. A single-pass room is a simple design in which air is pumped into the room from the top and blown out vents at the bottom. If you have to air condition your cleanroom, then you don’t want to just blow that expensive air-conditioned air through the cleanroom and out into a warehouse or other environment where it does little good. A recirculating design uses a double-ceiling system (the space between the ceilings is called a plenum) or a double wall (the space in the walls is called an air chase), or a combination of these two. Cooled, clean air is introduced through the HEPA filters and then flows out of the room, carrying with it any contaminants, into either the air chase or the ceiling plenum, where it is reintroduced into the room after once again passing through the HEPA filters. 

When you plan and budget for your cleanroom, consider that installation of a recirculating system for that enclosure is going to account for at least 50% of the installation costs. If it is possible to place the cleanroom in an area where air conditioning is already in place, that will be a huge money-saver.

9. Not considering all that is needed 

It pays to bring in an expert early in the planning process. They can help you troubleshoot airflow problems, determine the types of testing procedures you must employ, and develop cleanroom protocols. A knowledgeable professional will point out things you would never consider on your own.

10. Think about process flow 

Give some thought to the workflow in your cleanroom. You want materials to come in one end and exit the other, completing all necessary assembly and packaging along the way. The goal is not only to improve productivity and yield but to maintain or improve the speed of the manufacturing process. Again, this is an area where a consultant can help.

11. The need for interior isolation. 

Every cleanroom ISO Class 7 or cleaner should have an anteroom for gowning, separated from the larger cleanroom by softwall curtains at a minimum. This keeps street dirt out of the clean area. Interior isolation is also important in food processing and pharmaceuticals to prevent cross-contamination. A recent Simplex project called for dividers for a vitamin processing operation: if vitamin B12 drifts into the vitamin C work area, that’s cross-contamination, and a huge problem. Cross-contamination can mean manufacturing shutdowns, product recalls and lost profits.

12. Clearance issues

People are often eager to use as much of their space as possible, but it is a good idea to leave some extra room between the ceiling of your cleanroom and the ceiling of your building (Simplex recommends three feet) so you can change out the pre-filters, HEPAs and ULPAs without a big hassle. Also consider that without enough clearance, a minimum of six inches, you run the risk of starving your filters of air.

13. More need for clearance 

Regarding installation of the room itself, three feet of clearance all the way around the side walls is important; it will make the installation easier because you will have more room to work. The common space-saving solution is to place the cleanroom against the building walls. This maximizes the footprint but makes the install more difficult.

14. Look for a cleanroom with extensive plans and instructions

The performance of your cleanroom hinges a great deal on the quality of the assembly. Look for a company that supplies the approval drawings with layout and elevations, and also shows the HEPA and lighting layout. Look for a manufacturer that supplies installation drawings with every panel marked with a letter or number that corresponds to the panel or part on the drawing. For large cleanrooms and critical applications you might need to consider a specialty contractor knowledgeable in cleanroom construction. 

15. Know the difference between a cleanroom and a clean zone and be clear in what you need 

Cleanrooms are areas in which the particulate count is measured and controlled; there is a requirement for a specific level of cleanliness. A clean zone is an area within an existing cleanroom that is even cleaner, much like the injection mold operation cited in #6.

There are also areas that a manufacturer may want to isolate, but not have to maintain at a certain ISO level. A recent project Simplex completed involved a manufacturing and warehousing facility where inventory was stored adjacent to the manufacturing area. Dust from the manufacturing process was drifting into the storage area. Although the product was not required to be clean when it was shipped, the manufacturer wanted it to be free from excessive dust—for aesthetic reasons. The goal in this case was isolation, not making the environment clean in the sense of a cleanroom. Simplex installed 300 running feet of industrial opaque vinyl curtain and two industrial strip doors that created a clean zone where product could be stored. Inventory could be shipped without the need for a second cleaning. 

16. Establish cleanroom operating procedures and have them documented

Make sure your employees read them, are familiar with them, and follow them—always. The single biggest source of contamination in a clean area is—you guessed it—the worker in that area. If your cleanroom requirements call for gowning and booties, then no employee should ever enter that space without them. To do otherwise contaminates your workplace and sacrifices the integrity of your manufacturing process. 

NOTE: Maintaining clean environments can be a complicated and critical task. The information provided here is meant to give the reader a basic understanding of the issues involving the selection of a cleanroom. Simplex recommends that you always consult and work with a cleanroom professional when implementing any sort of isolation procedures in your workplace, laboratory or hospital, thus ensuring that you maintain the highest standards. 

Simplex Isolation Systems wishes to thank Richard Matthews of Filtration Technology, Inc., in Greensboro, NC, for his input on this white paper. 

Data Center
Educational Article

Key Steps to Optimize Data Center Cooling Performance

Article Featured in CIO North America Issue 35 (page 66)
By: Gordon Johnson – Senior CFD Manager

One of the most important things data center operators can do to reduce energy costs is to make the cooling system as efficient as possible, especially since, next to the IT equipment itself, cooling is the main source of energy usage and potential wasted energy in the data center. Optimizing cooling performance reduces cooling costs and OPEX, so it should be at the top of the list for anyone looking to reduce annual energy costs and carbon footprint while becoming greener and more sustainable.


With today’s average rack power density between 10 and 11 kW, many data centers likely have unused or even wasted cooling capacity in their air-cooled facilities, especially if they’re not practicing a comprehensive airflow and rack management strategy such as installing containment and practicing good rack hygiene (blanking panels, sealing underfloor gaps, etc.).


After airflow and rack management have been implemented, it’s time to raise the temperatures (cooling optimization) and lower the airflow to the IT equipment (airflow optimization). In terms of efficiency and energy savings, most experts agree that a 1°F (0.55°C) increase in supply temperature results in approximately a 1.6 to 2% savings in the overall cost to operate the data center. With this in mind, there’s no reason not to operate supply temperatures as close as possible to the ASHRAE (American Society of Heating, Refrigerating and Air-Conditioning Engineers) recommended server inlet maximum of 80.6°F (27°C). Not doing so is simply the biggest overall waste in efficiency and TCO (Total Cost of Ownership) for data centers.
And that’s not all. Along with raising supply temperatures, most newer data centers are equipped with air-side or water-side economizers, which deliver even more energy savings because higher set-point temperatures increase the amount of time that outdoor air can be used for cooling.
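
To make that rule of thumb concrete, here is a minimal sketch applying the 1.6 to 2% per 1°F figure quoted above; the starting setpoint and annual energy spend are hypothetical:

```python
# Sketch: estimated savings from raising supply air temperature, using the
# ~1.6-2% savings per 1 degF figure cited above. The current setpoint and
# annual energy cost are hypothetical assumptions.

current_supply_f = 68.0             # assumed existing supply temperature
target_supply_f = 80.6              # ASHRAE-recommended server inlet maximum
annual_energy_cost_usd = 1_000_000  # hypothetical annual spend

delta_f = target_supply_f - current_supply_f  # 12.6 degF of headroom

for rate in (0.016, 0.020):         # low and high savings estimates per degF
    savings = delta_f * rate * annual_energy_cost_usd
    print(f"At {rate:.1%} per degF: ~${savings:,.0f} saved per year")
# Roughly $200,000-$250,000 per year on this hypothetical $1M energy bill.
```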


The second step to improving energy efficiency is airflow optimization, since we only want to supply the amount of airflow (CFM/CMH) that our IT equipment needs; typically we want 10-15% more supply airflow to ensure we maintain positive pressure in our cold aisle(s). This practice reduces bypass air: wasted supply air from the cooling units that does not contribute to cooling the IT equipment. Bypass air is an expensive problem because it costs money both to cool and to move the supply air, so we need to ensure that it goes only to the server inlets in our racks.
To achieve the most savings from airflow optimization, cooling units should be equipped with either VFDs (Variable Frequency Drives) or EC fans to maintain the 10 to 15% higher supply airflow versus demand airflow in the room. Many new data centers are using the “flooded room” design to achieve large savings with airflow optimization. By using perimeter cooling such as CRAHs, fan walls, etc. to flood the entire white space, they avoid having to match floor tiles or individual cold aisles to airflow (CFM/CMH) requirements for various rack power densities (kW).
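
As a back-of-the-envelope sketch of that 10 to 15% margin (the IT load and temperature rise below are assumed values; the 3.16 constant is the commonly used sea-level approximation relating sensible heat to airflow):

```python
# Sketch: sizing supply airflow for a 10-15% positive-pressure margin.
# Demand airflow uses the common sea-level approximation
#   CFM = 3.16 * watts / deltaT(degF).
# IT load and deltaT below are illustrative assumptions.

it_load_kw = 200.0   # assumed total IT load
delta_t_f = 20.0     # assumed server inlet-to-exhaust temperature rise

demand_cfm = 3.16 * (it_load_kw * 1000.0) / delta_t_f  # ~31,600 CFM
supply_min = demand_cfm * 1.10                          # +10% margin
supply_max = demand_cfm * 1.15                          # +15% margin

print(f"IT demand airflow: {demand_cfm:,.0f} CFM")
print(f"Target supply range: {supply_min:,.0f} - {supply_max:,.0f} CFM")
# VFD or EC fan speeds would then be trimmed to hold this margin as load changes.
```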


In conclusion, major colocation and hyperscale data center operators have long realized the importance of airflow management, which starts with containment, because it enables them to achieve large energy and efficiency savings via airflow and cooling optimization. As rack densities continue to grow, this will become even more important if we’re going to become more sustainable as an industry and take care of our natural resources. After all, the best energy saved is the energy we don’t consume in the first place, and that has never been truer than in our industry.

Data Center
Educational Article

It’s Time to Rethink the Concept of Micro Data Centers

Article Featured in Inside_Networks Magazine Page 14-15, Letter to Editor
By: Andy Connor, Director – EMEA Channel

Data centres that are designed to meet the needs of standard or enterprise business applications are plentiful. Yet flexible and user defined data centres for edge applications, which rely on dynamic real time data delivery, provisioning, processing and storage, are in short supply.

That’s partly because of the uncertainty over which applications demand such infrastructure and over what sort of timeframe. However, there’s also the question of flexibility. Many of today’s micro data centre solutions serve a predefined concept of the edge, or, more accurately, of localised, low-latency applications, which also demand high levels of agility and scalability. This is due to their predetermined or vendor-led approach to design and infrastructure components.

To date, the market has been served by small-scale edge solutions deployed in pre-populated, containerised form. A customer is often required to conform to a standard shape or size, and there’s no flexibility in terms of modularity, components or make-up.

One might argue it comes with the subjective nature of edge computing, which is often shaped to support a vendor defined technology. Standardisation has also been beneficial for our industry, offering several key advantages including the ability to replicate systems across multiple locations. But when it comes to the edge, some standardised systems aren’t built for the customer – they’re a product of vendor collaboration. This is also accompanied by high costs and long lead times.

On the one hand, having a piece of pre-integrated infrastructure with everything in it can undoubtedly solve some pain points, especially where deployment is concerned. But what happens if the customer has their own alliances, their own definition of the edge, or may not need all of the components? What happens if they run out of capacity in one site or need a modular system that scales?

Then those original promises of scalability or flexibility disappear, leaving the customer with just one option: to buy another container. One might consider that rigidity, when it comes to standardisation, can often be detrimental to the customer. The point here is that when it comes to micro data centres, a one-size-fits-all approach does not work. End users need the ability to choose their infrastructure based on their business demands, whether they are in industrial manufacturing, automotive, telco or colocation environments. But how can users achieve this?

Vendor-agnostic and flexible micro data centres are the future for the industry – an approach that builds containment systems around customers’ needs, without forcing their infrastructure to fit into boxes. Users should have the flexibility to utilise their choice of best-in-class data centre components, including the IT stack, uninterruptible power supplies (UPS), cooling architecture, racks, cabling and fire suppression systems.

By taking an infrastructure agnostic approach it’s possible to give customers the ability to define their edge, and use standardised and scalable infrastructure in a way that’s truly beneficial to their businesses.

Editor’s comment

Growing data demands are forcing engineers to think creatively about the ways they design and develop data centres. Andy’s point about the rigidity of some micro data centre solutions is pertinent and one that needs to be addressed in order to fully meet the potential of the edge.

Company
Team

Subzero Engineering Appoints New UK/EMEA Channel Manager

Article Featured in Data Centre Network News

Subzero Engineering has announced it has appointed Craig Brown as its new UK/EMEA Channel Manager.

Craig brings with him a wealth of data centre industry and IT Channel expertise, having held a variety of sales, marketing, and management roles throughout his career. Over the past 25 years Craig has worked for some of the industry’s foremost infrastructure vendors, including APC, Anixter, Eaton, Geist and Vertiv, and has been appointed to support Subzero’s expansion strategy as it scales across the EMEA region.

Subzero Engineering’s impressive track record for technology innovation, engineering consultancy, data-driven design, and environmental impact services, combined with its dynamic expansion plans, was a major factor in Craig’s decision to join the company. The company has a strong and demonstrable track record of working with the hyperscale and colocation communities and supporting the digital transformation efforts of world-leading industrial manufacturers, retail, and fashion brands. Craig’s experience of working with the Internet Giants, and with customers in the financial and telco sectors, will be crucial to the company’s efforts.

In his new role, Craig will be responsible for scaling the company’s partner base, building on its engineering, structured cabling, and heating, ventilation, and air conditioning (HVAC) partners to drive growth across the region. With technological expertise in thermal dynamics, the data centre powertrain, and white space technologies, he understands the critical role that M&E consultants play in the industry. Further, he will continue to develop the company’s new services offering, which utilises computational fluid dynamics (CFD) software to evaluate and analyse legacy systems and form a data-driven basis on which to build businesses’ digital transformation and modernisation efforts.

“The data centre industry is one of the world’s most important sectors, and the work of its mechanical and engineering (M&E) professionals is essential as digitalisation efforts accelerate,” says Craig Brown, UK/EMEA Channel Manager, Subzero Engineering. “I believe our environmental impact services, and innovative approaches to vendor-agnostic data centre solutions, provide our partners with an opportunity to address end-user challenges, add considerable value, and drive long-term growth.”

“I want to build a Channel program which showcases Subzero’s world-class engineering capabilities, which are dynamically delivered and supported with a high-quality service and support package,” he continues. “Our primary ambition, however, is to develop a Channel program that is based on true partnerships, and one which will push the industry to better support its partners.”

“I’m delighted that Craig has joined the company as part of our European expansion plans, bringing the perfect blend of ambition, energy and experience to this key role” says Andy Connor, EMEA Channel Director, Subzero Engineering. “We believe that our technologies offer partners a major opportunity to deliver a true combination of flexibility, modularity, scalability, and sustainability, all of which are crucial to help customers on their digital transformation journeys.”

Data Center
Educational Article

Subzero Engineering: Sustainable Solutions for Data Centers

Article Featured in AI Magazine

Consultancy and customised containment – which complement the data centres they work with – is the global calling card of Subzero Engineering

Subzero Engineering recognises data centers are dynamic environments, so they have created customised containment solutions which make energy-efficient savings for their customers.

Subzero Engineering is the industry leader in bespoke containment solutions, using computational fluid dynamics (CFD) to show measurable results for its customers, including the following savings since 2015: $300 million in energy costs, 1.5 billion gallons of water, and a three-million-tonne reduction in carbon dioxide emissions.

“We believe that a data-driven approach is essential to drive data centre performance and efficiency,” commented Andy Connor, Director EMEA Channel, who points out they offer CFD checks for free.

“We help our customers do this with our customised, streamlined, and energy-efficient containment solutions, which result in a lower total cost of ownership and reduced carbon emissions.”

Subzero Engineering has manufacturing facilities in Salt Lake City, US, where they were founded in 2005 (starting out as a data centre airflow consulting company), and in Dublin, Ireland.

“We have a large team of leading industry experts that helps us operate globally, and at speed, and we work with customers ranging from the hyperscale and colocation communities through to well-known brands across sports, retail, HPC, and AI,” said Connor.

Partnership with atNorth
Subzero Engineering has been working with atNorth, a high-performance, sustainable data centre operator in Iceland, for the past three years.

“When atNorth began the process of building their data centre halls they got in touch with us to provide the hot and cold aisle containment systems. Their facility is unique in its structure, so we moved from simply providing containment solutions to working with them consultatively to create a standardised ultra-efficient and performance focused system and something that could be repeated across multiple sites as their business grew.”

Climate neutral data centre pact
One of the drivers currently influencing data centre design is the fact that hyperscalers and members of the colocation community have signed up to the Climate Neutral Data Centre Pact.

“New data centres are being designed for sustainable operations, but designs need to be more flexible to accommodate the demands of GPU chips and processing power, so there’s a real challenge to find that balance,” said Connor. “However, I think the real challenge in the market is the legacy facilities. These really need to be updated and modernised to become more efficient in order to reduce their OPEX, energy consumption, and CO2.”

Balance performance and efficiency
Connor says Subzero Engineering helps operators balance performance and efficiency. “We started life back in 2005 as a CFD consultancy when data centres were using raised floors and experiencing issues with leakages. Our software solution showed customers how they could analyse the infrastructure and improve efficiency.

“Fast forward 16 years and that approach has stayed with us. We’re an engineering-led solutions provider who helps businesses reduce their carbon footprint and operating costs – but it all starts with the data we produce from our CFD reports,” he said.

Data Center
Educational Article

The Role of Containment in Mission-Critical Edge Deployments

Article Featured in Data Centre & Network News
By Gordon Johnson, Senior CFD Engineer at Subzero Engineering

Today, edge data centers need to provide a highly efficient, resilient, dynamic, scalable and sustainable environment for critical IT applications.

Subzero Engineering believes that containment has a vital role to play in addressing these requirements.

In recent years, edge computing has become one of the most prevalent topics of discussion within our industry. In many respects, the main purpose of edge data centers is to reduce latency and delays in transmitting data and to store critical IT applications securely. In other words, edge data centers store and process data and services as close to the end user as possible.

Edge is a term that’s also become synonymous with some of the world’s most cutting-edge technologies. Autonomous vehicles have often been discussed as one of the truest examples of the edge in action, where anything less than near real-time data processing and ultra-low latency could have fatal consequences for the user. There are also many mission-critical scenarios, including within retail, logistics and healthcare, where a typically high-density computing environment, packed into a relatively small footprint with a high kW/rack load, is housed within an edge environment.

Drivers at the edge

According to Gartner, internet-capable devices worldwide surpassed 20 billion by 2020 and are expected to double in number by 2025. It is also estimated that approximately 463 exabytes of data (1 exabyte is equivalent to 1 billion gigabytes) will be generated each day by 2025, which equates to the same volume of data as 212,765,957 DVDs per day!

While the Internet of Things (IoT) was the initial driver of edge computing, especially for smart devices, these examples have been joined by content delivery networks, video streaming and remote monitoring services, with augmented and virtual reality software expected to be another key use case. What’s more, transformational 5G connectivity has yet to have its predicted, major impact on the edge.

Clearly, there are significant benefits in decentralizing computing power away from a traditional data center and moving it closer to the point where data is generated and/or consumed. Right now, edge computing is still evolving but one thing we can say with certainty, is that the demand for local, near real-time computing represents a major shift in what types of services edge data centers will need to provide.

Efficiency and optimization remain key

An optimized edge data center environment must meet a long list of criteria, the first being reliability, as edge facilities are often remote and have no on-site maintenance capabilities. Secondly, they require modularity and scalability: the ability to grow with demand. Thirdly, there’s the issue of the lack of a ‘true’ definition: customers still need to define the edge in the context of their business requirements, deploying infrastructure in line with business demands, which can of course affect the design of their environment. And finally, speed of installation: for many end-users time to market is critical, so an edge data center often needs to be built and delivered on-site in a matter of weeks.

There is, however, one more important factor to consider. An edge data center should offer true flexibility, allowing the user to quickly adapt or capitalize on new business opportunities while offering sustainable and energy efficient performance.

Edge data centers are, in many respects, no different from traditional facilities when it comes to the twin imperatives of efficiency and sustainability. PUE as a measure of energy efficiency applies to the edge as much as to large, centralized facilities.

And sustainability, especially the drive towards net zero, is a major focus for the sector in its entirety. However, what will change over time is the ratio of edge data centers. By 2040, it’s predicted that 80% of total data center energy consumption will be from edge data centers, which begs an obvious question: what will make the edge energy efficient, environmentally responsible, reliable and sustainable all at the same time?

The role of containment

Containment is almost certainly the easiest way to increase efficiency in the data center. It also makes a data center environmentally conscious because, instead of consuming energy, containment saves it. This is especially true at the edge.

Containment helps users get the most out of an edge deployment because it prevents cold supply air from mixing with hot exhaust air. This allows supply temperatures at the server inlets to be increased.

Since today’s servers are recommended to operate at temperatures as high as 80.6 degrees Fahrenheit (27 degrees Celsius), containment allows for higher supply temperatures, less overall cooling, lower fan speeds, increased use of free cooling and reduced water consumption – all important factors when it comes to improving efficiency and reducing carbon footprint at the edge.

Further, a contained solution consumes less power than an application without it, which means an environmentally friendly, cost-effective environment. Additionally, it improves reliability, delivering longer Mean Time Between Failures (MTBF) for the IT equipment, as well as lower PUE.

Uncertainty demands flexibility

Subzero believes that an edge data center needs to be flexible and both quick and easy to install. It needs to be right-sized for the here and now, but capable of incremental, scalable growth. Further, it should allow the customer to specify the key components, such as the IT, storage, power and cooling solutions, without constraining them by size or vendor selection.

Thankfully, there are edge data center providers who now offer an enclosure built on-site in a matter of days, with ground-supported or ceiling-hung infrastructure to support ladder racks, cable trays, racks and cooling equipment.

These architectures mean the customer can choose their own power and cooling systems and, once the IT stack is on-site and the power is connected, the data center can be up and running in a matter of days.

Back in 2018, Gartner predicted that, by 2023, three-quarters of all enterprise-generated data would be created and processed outside a traditional, centralized data center. As more and more applications move from large, centralized data centers to small edge environments, Subzero anticipates that only a flexible, containerized architecture will offer end-users the perfect balance of efficiency, sustainability and performance.

Data Center
Success Story

atNorth, Subzero Standardize Approach to HPC Colocation

Article Featured in Data Centre Magazine

A case study by Subzero Engineering shows how leading Nordic data centre services firm atNorth was able to standardise its approach to HPC colocation

Summary

atNorth is a leading Nordic data centre services organization based in Reykjavik, Iceland. It offers environmentally responsible, power-efficient, and cost-optimized data centre hosting facilities, with the capabilities to deliver high-performance computing (HPC) services.

By working with Subzero Engineering, a leading provider of data centre containment solutions, the company was able to standardize its approach to HPC colocation; using a scalable, energy-efficient, and ultra-secure, fault-tolerant cold aisle containment (CAC) methodology to replicate its sustainability and performance capabilities across multiple sites.

Customer Background

atNorth is a leading Nordic data centre services company offering environmentally sustainable, power-efficient, and cost-optimized data centre hosting facilities. Its Tier III, redundant design and its innovative ability to support rack densities ranging from 40kW – 100kW make it the perfect partner for organizations using high-performance computing (HPC) to solve some of the world’s most challenging problems.

With operations in Stockholm, Sweden and Reykjavik, Iceland, the company’s mission is to offer more compute for a better world, leveraging innovative data centre designs, power efficiency, and intelligent clusters to support the disruptive technologies used by customers. This includes workloads that require High Performance Computing (HPC) infrastructure, such as simulations, scientific calculations, artificial intelligence (AI), deep learning, and blockchain applications.

At its Icelandic Thor DC and Mjölnir DC colocation campuses, the company continues to push the boundaries of Nordic data centres; using 100% renewable energy resources from hydropower and geothermal sources to power their facilities, which are optimized for ultra-energy efficiency, maximum reliability, and industry-leading performance.

With this approach, which incorporates Direct Free Air cooling and carbon-free energy, atNorth delivers a Power Usage Effectiveness (PUE) rating below 1.2 at its Tier III Mjölnir DC. This strategy offers customers a reduced total cost of ownership (TCO), increased operational and energy efficiency, and a secure, scalable data centre platform that protects the long-term lifecycle requirements of their infrastructure deployments.
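
For context, PUE is total facility energy divided by the energy delivered to IT equipment, so a rating below 1.2 means less than 0.2 watts of overhead for every watt of IT load. A minimal sketch with illustrative numbers (not atNorth’s actual load data):

```python
# Sketch: Power Usage Effectiveness (PUE) = total facility energy / IT energy.
# Illustrative figures only, not actual atNorth measurements.

it_load_kw = 1000.0    # hypothetical IT load
overhead_kw = 180.0    # hypothetical cooling, power conversion and lighting

pue = (it_load_kw + overhead_kw) / it_load_kw
print(f"PUE: {pue:.2f}")   # 1.18, consistent with a sub-1.2 rating
```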

Challenges

When designing its second 80MW, Mjölnir DC data centre campus in Reykjanesbaer, Iceland, atNorth was looking for a containment partner that was able to deliver to demanding timescales.

The company required a high-quality, robust, and secure containment solution that would offer the ability to standardize their design, while delivering repeatable performance, sustainability, and efficiency capabilities across multiple sites.

Further, due to its reputation for sustainable HPC and colocation, and for building long-term customer relationships, the company was looking to establish a new supply chain partner who could work with them as the company grew.

Proposed Solution

Working to meet the company’s requirements for speed, efficiency, and precision, Subzero Engineering quickly engaged with Jóhann Þór Jónsson, atNorth’s Director of Project Management and Business Development. Rather than offer a simple proposal containing a product specification and cost, Subzero’s engineers provided consultative expertise remotely from the USA, offering valuable insight that would help to future-proof the data centre and meet growing customer demand.

Once a relationship was established, Subzero specified its Essential Plus+ product line, offering a vendor-neutral, quick-to-deploy, and flexible containment system. Available globally, the Essential Plus+ products would provide atNorth with a standardized containment architecture able to accommodate any customer’s HPC rack, server, or storage requirement.

“Subzero’s response time was exceptional,” said Jóhann Þór Jónsson, Director Project Management and Business Development, atNorth. “They not only specified a cold aisle containment architecture complete with security doors and top roofs, but worked with us consultatively to engineer a robust, clean, and energy-efficient system that would look visually impactful and fit with the site’s geothermal surroundings.”

Results

The sleek look and feel, best-in-class materials, and energy-efficient architecture of the Essential Plus+ products met atNorth’s requirements for a customizable, robust and high-quality containment solution. Moreover, it would enable them to standardize and quickly scale across new sites, using a methodology that delivers increased security, performance, and sustainability. This is a pivotal approach, and has informed the design, construction and development of its third climate-positive data centre in Stockholm.

“Subzero Engineering has given us a standardized, repeatable, and physically secure containment system, which fits well with our own philosophy,” said Jóhann Þór Jónsson, Director Project Management and Business Development, atNorth. “They have offered us a flexible containment solution, focused on both performance and efficiency, but which is easy to customize with the changing requirements of our intensive computing customers.”

Further, the synergies between the companies were clear from the outset, with both firms’ values rooted in pushing the boundaries of performance, sustainability, and energy efficiency. Subzero’s containment solutions would not only contribute towards atNorth’s industry-leading low PUE; their consultative approach would deliver exceptional value, establishing them as a long-term partner for the company’s high-performance, sustainable colocation services.

“As a business, we’re always focused on the long-term objectives of our customers, and we choose to work with companies whose values are aligned with ours,” continued Jóhann Þór Jónsson, Director Project Management and Business Development, atNorth. “Subzero Engineering remained service-minded, agile, and worked to truly understand our business: providing a consultative, value-add and intricate data centre solution that meets our demands for performance and efficiency both now, and in the future.”

Data Center
Educational Article

Keeping the Edge Customer-Focused

Article Featured in Data Centre Review
By: Andy Connor, Director – EMEA Channel at Subzero Engineering

For many years, the data centre industry has been engaged in a deep discussion on the concept of edge computing. Yet the definition varies from vendor to vendor and from customer to customer, creating not only mass confusion, but a fixed mindset in terms of solutions design.

One might argue that through its lack of a true definition, the subjective nature of the edge has led the industry down an often singular path, where edge technologies have been designed to hypothetically meet the customers’ needs, but without the application in mind.

IDC defines the edge as the multiform space between physical endpoints such as sensors and the ‘core’, or the physical infrastructure – the servers, storage and compute – within cloud locations and data centres. Yet within more traditional or conservative sectors, some customers are yet to truly understand how the edge relates to them, meaning the discussion needs to change, and fast.

Defining the edge

When the trend of edge computing began to gain traction, the Infrastructure Masons were one of the first to try and define it. But even they recognised its largely subjective nature was beginning to cause market confusion, and stated that a widely accepted definition would become more essential as the industry began to confront the challenges that will arise at the edge.

What’s clear is that the business case for edge technologies is becoming more prevalent, and according to Gartner, “by 2022, more than 50% of enterprise-generated data will be created and processed outside the data centre or cloud.” All this data invariably needs a home and depending on the type of data that is stored, whether it’s business or mission-critical, the design and location of the infrastructure will undoubtedly need to vary.

One size fits all?

Today in our industry, there’s a very real danger that, when it comes to the edge, many end-users will be sold infrastructure defined by the manufacturer and not based on the customer’s needs. And that’s because edge solutions are often found in one size, type or variable form factor. This creates a market whereby potential customers are persuaded that ‘one size fits all’, and that’s a far cry from the modular and agile approach that the industry has turned towards in recent years.

The reality is that the edge has almost as many definitions as there are organisations trying to define it. And while there are a range of well-defined and well-understood edge applications already in use, such as micro data centres in retail locations and localised infrastructure providing low-latency content delivery to avid viewers, there are many edge applications yet to be fully understood, defined or implemented.

Many existing edge applications remain unpredictable in terms of their data centre and IT resources. And often local infrastructure is required to support the continued roll-out of a service looking to scale.

In summary, most, if not all, organisations are faced with making frequent decisions about the best place to build, or access, edge infrastructure resources. And in today’s dynamic, digital world such decisions need to focus on the customer’s business requirements, providing them with a flexible, agile and optimised architecture that’s truly fit-for-purpose.

Finding flexible solutions

A standard-size container or micro data centre might be far too big for the business’ needs – but the assumption is that maybe the user will grow into it. And then there’s the question of customisation. What if the solution needs to be liquid-immersion cooling enabled for GPU-intensive computing at the edge? Not every micro data centre architecture can be built for that technology, and certainly not if the customer needs to scale quickly.

There’s also the question of cost. Micro data centres in standard form factors, or pre-integrated systems, often contain CAPEX-intensive server and storage technologies from manufacturers defined by the vendor. This, again, is a far cry from a solution defined to meet the business’s needs.

In our industry, relationships are everything, and one must acknowledge that customers will want to specify power, cooling and IT infrastructure from their own choice of suppliers, and at a cost that meets their budgetary requirements.

At Subzero Engineering, we believe customers need a solution that supports their business criteria, and one that helps them capitalise on the emerging opportunities of the edge. What’s more, we believe that containerised edge data centres, which are optimised for the application, built ready to scale, and vendor-neutral for any type of infrastructure, are those that can truly meet the needs of the end-user.

What’s clear is that with the advent of edge computing, the customer needs to define their edge. And as design and build consultants, our goal must be to support their needs with flexible, mission-critical solutions.