Time for a big blackout?

Data centers account for a growing share of global energy consumption and emissions. An average user PC consumes 1.3 kWh of electricity every three hours without even going online; transmitting one million static web page requests per second has been estimated at another 11,610 kilowatt-hours (kWh), enough to power 13 American homes for a month.

According to Peter Hewkin, founder of British edge computing company Smart Edge Data Centers (SEDC), many countries already face limited capacity at peak hours, with the next wave of data-hungry digital growth still to come. Yet he estimates that at least 85% of the data stored by UK limited companies, and thus 85% of the data now kept in conventional data centers or server rooms, could be switched off when not needed and kept in “warm” storage instead.

“The accounts for the current year should be accessible all the time, but not those for the remaining six years,” he says. “If a data center currently consumes 1MW, it may take only 150kW to support the critical data.”

Consuming 1MW of power 24 hours a day at £80 per MWh costs around £700,000 a year. With warm storage making data accessible only on business days (excluding holidays), or 252 days per year, the savings already reach about 30%.

If data access can be restricted to two designated hours on weekdays, that works out to just 506 hours a year, less than 6% of the 8,760 hours in an always-on year. “About 15% of the data should be available 24 hours a day, seven days a week, but that still only takes us to 1,820 hours,” Hewkin says, adding that better prices could be negotiated with providers by staying out of the “red” band at peak times, for example between 4pm and 8pm. “This represents almost 80% savings in total consumption and CO2 emissions.”

For green and amber band prices, he thinks around £60/MWh could be realistic, with even £20/MWh potentially achievable if only two hours a day are required at the green band price.

“Using 150kW 24×7 at £80/MWh, our annual electricity charge will be around £105,120. The remaining 850kW is used for only 506 hours per year at £60/MWh, which comes to £25,806. The annual total is £130,926, compared with £700,800 currently. This represents a saving of 81%,” he says.
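Those figures are easy to check. The sketch below, in Python, simply reproduces Hewkin’s arithmetic from the numbers quoted above; the £60/MWh off-peak rate and the 506-hour access window are his assumptions rather than measured values.

```python
# Reproducing Hewkin's arithmetic from the figures quoted above.
HOURS_PER_YEAR = 24 * 365                 # 8,760 hours in an always-on year
flat_rate = 80                            # £ per MWh, the quoted baseline tariff

# Current situation: the full 1 MW runs around the clock at the flat rate.
baseline_cost = 1.0 * HOURS_PER_YEAR * flat_rate               # ≈ £700,800

# Proposed split: 150 kW of critical load stays on 24x7, while the remaining
# 850 kW is powered for only 506 hours a year at an assumed negotiated
# off-peak rate of £60/MWh.
critical_cost = 0.15 * HOURS_PER_YEAR * flat_rate              # ≈ £105,120
warm_cost = 0.85 * 506 * 60                                    # ≈ £25,806
proposed_cost = critical_cost + warm_cost                      # ≈ £130,926

saving = 1 - proposed_cost / baseline_cost
print(f"Baseline £{baseline_cost:,.0f}, proposed £{proposed_cost:,.0f}, "
      f"saving {saving:.0%}")                                  # saving ≈ 81%
```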

Electrical consumption

John Booth, managing director of consultancy Carbon3IT, estimates that data centers, including colocation facilities, account for at least 12% of UK electricity consumption, or 41.11 TWh per year.

Cisco has previously forecast that global cloud IP traffic would exceed 14.1 zettabytes (ZB) by the end of 2020, while the Seagate-sponsored IDC Data Age 2025 report projects overall data growth of 30% per year to reach 175 ZB, with 7.5 ZB of it stored, up from 1.1 ZB in 2019. Hyperscaler growth is expected to continue at a CAGR of 2% annually through 2025, according to ResearchAndMarkets.

Data center efficiency: a work in progress

Sustainability innovations of all kinds continue to increase efficiency in data centers, but clearly there is still room for other approaches.

French researchers Issam Raïs, Anne-Cécile Orgerie and Martin Quinson, writing in the peer-reviewed journal Concurrency and Computation: Practice and Experience, quantified the likely impact of various shutdown techniques in the data center.

Operators have often been reluctant to reduce the number of powered-on servers because of concerns about responsiveness and hardware failures, and misjudgments about the energy gains on offer, they suggest.

The team simulated various production infrastructures and machine configurations under different shutdown policies and workload predictions. These covered actual server shutdown and hibernation modes, including suspend-to-disk and suspend-to-RAM techniques, heterogeneous processing, startup costs in time and power, and effects on server lifespan, as well as the evaluation of power-aware algorithms (in related work).

“Shutdown techniques can save at least 84% of the energy that would otherwise be wasted on idle nodes. This remains true for prospective power-proportional hardware, and even aggressive shutdown policies do not affect the life of the hardware,” they write.
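The intuition behind that result can be captured with a back-of-the-envelope break-even test: switching a node off pays for itself whenever the energy spent shutting down and booting back up is less than the energy the node would have burned while idle. The sketch below illustrates that test; every figure in it (idle draw, transition cost, durations) is an illustrative placeholder, not a value from the paper.

```python
# Illustrative break-even test: is it worth switching an idle node off?
def shutdown_saves_energy(idle_watts: float,
                          idle_seconds: float,
                          off_watts: float,
                          transition_joules: float,
                          transition_seconds: float) -> bool:
    """Return True if powering the node off for the idle period uses less
    energy than leaving it idle for the same period."""
    if idle_seconds <= transition_seconds:
        return False   # the gap is too short to shut down and boot back up
    energy_if_idle = idle_watts * idle_seconds
    energy_if_off = transition_joules + off_watts * (idle_seconds - transition_seconds)
    return energy_if_off < energy_if_idle

# Example: a node idling at 100 W for one hour, versus a shutdown/boot cycle
# costing 15,000 J over 150 s with a residual 2 W draw while "off".
print(shutdown_saves_energy(100, 3600, 2, 15_000, 150))   # -> True
```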

SEDC’s Hewkin notes that some forecasts still suggest data center power consumption could eat up 20% of UK generation within the next few years if current trends continue. Yet most of the data generated is used only a few times, sometimes just once, and is then stored indefinitely “just in case”, such as a company’s bank or business accounts. Banks have already started moving older consumer data into inaccessible “cold” storage.

Warm data centers

Additionally, he anticipates that “warm” data centers could cost half as much to build as traditional data centers, with less plant equipment and no backup generators. Existing data centers could be reused, Hewkin concludes, for emerging data-driven use cases, from automation to artificial intelligence.

New technologies and strategies in hardware, media, software, and architecture can continue to reduce data traffic and storage requirements.

Chief among these are software solutions, including active archiving, that control access to data, because a big challenge for any large-scale, server-level switch-off has been how to provision resources efficiently and dynamically.

On-demand workloads are highly variable, with no well-established taxonomy defining them as they traverse the cloud. Internal and external migration of virtual resources can be a problem, even as IaaS, PaaS and containerization continue to advance alongside prediction methods such as regression techniques or time-series analysis.
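As a simple illustration of what such prediction methods feed into, the sketch below uses a trailing-average forecast, about the crudest time-series technique available, to decide how many servers to keep powered for the next interval; the window size, headroom factor and per-server capacity are arbitrary assumptions for the example.

```python
import math
from collections import deque

def servers_needed(recent_load, per_server_capacity, headroom=1.2):
    """Forecast next-interval demand as the mean of a trailing window and
    convert it into a count of servers to keep powered, with spare headroom."""
    forecast = sum(recent_load) / len(recent_load)
    return max(1, math.ceil(forecast * headroom / per_server_capacity))

# Six recent 10-minute samples of request rate (requests/sec), made up for the sketch.
window = deque([320, 290, 310, 350, 330, 340], maxlen=6)
print(servers_needed(window, per_server_capacity=100))   # -> 4 servers stay on; the rest can sleep
```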

Tony Lock, Distinguished Analyst at Freeform Dynamics, agrees that suitable software solutions are beginning to emerge. The approaches will likely select and combine the mix of available and increasingly cost-effective media types, from solid state disk (SSD) to tape, disk, and “perhaps even non-volatile memory in the future.”

However, if data centers are going to start taking customer data offline, they had better make sure those customers do not want to use, or even look at, that data, he says. A big blackout “sounds wonderfully simple in theory; the problem is that in practice it’s an absolute nightmare.”

The entire data accessibility area is littered with traps that are likely to cause massive friction between customers, IT providers, in-house IT teams, and data center companies.

“It is really difficult to classify that data. Yes, you can review it and maybe try to see when it was last accessed,” Lock says. “Let’s say it was some time ago, so maybe you can move it to a cooler, cheaper form of storage, or eventually to something that doesn’t use electricity at all until you want to restore it.”

An investment in data discovery is needed

An investment in data discovery and analysis is required, which obviously takes time, effort and resources, and is likely to be ongoing. Only after that analysis should data be moved offline to a cold, or even warm, storage alternative, according to specific business needs and requirements.

“You have to be absolutely sure that the user of that data will not look for it very quickly,” says Lock.

“Unless you can get going with some sophisticated form of quasi-file software that not only knows what data you have, but understands how valuable it is to the business and what policies are involved in moving it elsewhere, achieving that is complicated.”

But Lock points to emerging developments in autonomous data management software spanning multiple cloud and storage platforms, from a number of “very large” incumbents and “extremely well-funded” new entrants, as well as the vendor-neutral Active Archive Alliance (AAA).

The AAA promotes automated, policy-based migration that frees up primary storage as data ages, while keeping actively archived data recoverable and searchable, thereby sidestepping the problem of changing people’s expectations about data availability.

The alliance believes that at least 60% of the 7.5 ZB projected to be stored by 2025 does not need to sit on higher-performance, more costly tiers, and can be archived after 90-120 days of low activity, unchanged and rarely overwritten.
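In practice, such a policy often keys on last-access time. The sketch below is a hedged illustration of that idea: it walks a directory tree and flags files untouched for 120 days as candidates for an archive tier. The threshold, the path and the reliance on access time alone are assumptions for illustration; a real deployment would also have to weigh business value and retention policy, as Lock points out above.

```python
# Illustrative only: flag files not accessed in 120 days as archive candidates.
import time
from pathlib import Path

ARCHIVE_AFTER_DAYS = 120        # assumed threshold, in line with the 90-120 day range above

def archive_candidates(root: str):
    """Yield files under `root` whose last access predates the cutoff."""
    cutoff = time.time() - ARCHIVE_AFTER_DAYS * 86_400
    for path in Path(root).rglob("*"):
        if path.is_file() and path.stat().st_atime < cutoff:
            yield path

for candidate in archive_candidates("/srv/data"):    # hypothetical data root
    print(f"would move {candidate} to the archive tier")
```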

“Not all of this data needs to be archived. But much of it should be, thanks to strong retention policies and the newfound ability to derive value from archived data,” states an AAA report.

Take the user into account

Lock notes that end users will typically complain when readily accessible data is taken away from them. This is perfectly understandable in behavioral terms: it is their data, so they want access and control, when and how they want it.

Businesses and users, politics and policies, must first come to terms with the idea that data is not “always available”. Providers need to examine how customers and internal business users will react. Have the policies actually been explained to users? Or will the company let the IT department take the blame for any lack of transparency?

“You have to have the right technology, but you also need the right people and policies so that the company is willing to do something about it,” Lock says. “And we’ve been talking about data ownership for a long time. Businesses want to be able to see their data. However, they don’t really want to do anything about it unless it’s really easy, or preferably invisible, to them.”

SEDC’s Hewkin agrees: “The industry has yet to adapt to data that is not available 24×7. Everything we need to know is supposed to always be available at the touch of a button or the click of a mouse.”
