
Interview

Cloud Datacenters: 4 Questions for Uptime Institute

Digital Infrastructure VP Steve Carter Tracks Progress of Public & Private Clouds

With the global growth of Cloud Computing solutions and the datacenters that support them, it seemed like a good time to fire a few questions to the Uptime Institute, which leads the global conversation about datacenters through its certifications, consulting, research, and educational programs.

So here are four questions for Steve Carter, VP of Digital Infrastructure Services at the Institute.

1. How are datacenters becoming more efficient? What are the major strategies being used to maximize processing power while trying to keep cooling costs under control?

Steve Carter: There are two strategies that should be part of any good datacenter efficiency improvement effort.

The first is reducing IT's electrical load by improving the utilization of IT systems on a per-server basis. The second is reducing the overhead power required by the mechanical and electrical systems that support the IT load.
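The second strategy is what the industry commonly measures as Power Usage Effectiveness (PUE). Carter doesn't name the metric here, so this is a sketch under that assumption, with purely illustrative numbers:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by IT load power.
    1.0 is the theoretical ideal; legacy datacenters often run 1.8 or higher."""
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw

# Illustrative numbers (assumptions, not from the interview): a 1,000 kW
# IT load carrying 800 kW of cooling and electrical-distribution overhead.
print(pue(1800, 1000))  # 1.8
```

Both of Carter's strategies move this ratio: raising per-server utilization shrinks the denominator's waste, while trimming mechanical/electrical overhead shrinks the numerator.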

Our Digital Infrastructure Services clients that currently average 30% virtualization across their distributed systems can realize a 2:1 payback on money spent for 3-year transformation projects that significantly reduce future total IT electrical loads.
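As a rough illustration of what a 2:1 payback implies, the arithmetic can be sketched as follows. All figures below are invented for the example, not Uptime Institute client data:

```python
# A "2:1 payback" means cumulative energy savings eventually reach
# twice the money spent on the transformation project.
project_cost = 1_500_000        # hypothetical 3-year project cost, USD
annual_kwh_saved = 4_000_000    # hypothetical savings from consolidation
price_per_kwh = 0.12            # assumed blended utility rate, USD

annual_savings = annual_kwh_saved * price_per_kwh   # 480,000 USD/year
years_to_2x = 2 * project_cost / annual_savings
print(f"{years_to_2x:.1f} years to a 2:1 payback")  # prints "6.2 years to a 2:1 payback"
```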

2. How important will latency and related issues be to datacenters? That is, what is the potential for datacenters to serve customers beyond their national and even continental borders?

Steve: Several large global companies have successfully consolidated datacenters into single geographical regions. Significant effort was required to test and deploy application environments that are more tolerant of global latency issues. Many legacy application environments must be replaced by web-services-style environments that allow global consolidation.
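To see why latency tolerance matters for regional consolidation, a back-of-the-envelope bound on round-trip time over fiber is useful. The distance and propagation speed below are approximations, not figures from the interview:

```python
# Light travels at roughly 200,000 km/s in optical fiber (about 2/3 of c),
# which puts a hard physical floor under cross-continental round trips.
def min_rtt_ms(distance_km: float, fiber_speed_km_s: float = 200_000) -> float:
    """Best-case round-trip time in milliseconds, ignoring routing,
    switching, and serialization delays."""
    return 2 * distance_km / fiber_speed_km_s * 1000

# New York to Frankfurt is roughly 6,200 km great-circle distance.
print(round(min_rtt_ms(6200)))  # 62 (ms, best case)
```

A chatty legacy application that makes dozens of sequential round trips per transaction multiplies that floor accordingly, which is why such environments often must be re-architected before a global consolidation.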

Consolidation allowed these global clients to reduce the total number of datacenter sites requiring global network connectivity. With fewer network concentration sites and fewer circuits to maintain, these companies were often able to dramatically increase the bandwidth of the remaining circuits at a lower total global cost.

3. I've been guilty of equating "datacenter hosting" with "cloud computing," even though that's not always the case. What percentage of hosted datacenter services will be focused on cloud computing over the next few years?

Steve: Public, outsourced cloud services will be adopted at different rates depending on an industry sector's maturity. New startup companies that do not have legacy infrastructures show very high percentages of public cloud deployment. At the other end of the spectrum, financial services organizations will be much slower to implement public cloud services.

I believe that public cloud adoption will follow the trends we observed for virtualization from 2006 to the present. Areas such as application development environments were among the first to be virtualized in quantity, and I think we are seeing the same trend develop for public cloud.

Applying cloud technologies within private datacenters is a trend that is gaining momentum. We have clients that are already in the development and test phases of transforming their client-facing web services environments from traditional architectures and infrastructures to private cloud environments.

4. To what degree do economies of scale start to apply to datacenters? That is, even with so much offsite cloud computing, there will be local, company-owned datacenters for many more decades, I would assume. Most of these would be smaller than large, hosted plants, right? So how important are economies of scale, and what can companies do to ensure their local datacenter is as optimized and efficient as possible?

Steve: I believe that private datacenters can benefit significantly by adopting the basic approaches used by datacenter service providers.

Service providers clearly understand their infrastructure CAPEX and OPEX costs for every square foot of space, every kW of power added and every BTU of cooling required.  This is not always true of private datacenter owners.  Understanding the true costs associated with any new added infrastructure requirement is necessary to effectively manage a datacenter at any scale.

Private datacenters benefit greatly from clearly understanding how every additional kW of IT load impacts their CAPEX and OPEX performance. Service providers must understand these basic financial facts if they are to remain in business.
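The per-unit accounting Carter describes can be sketched as a simple annualized cost per added kW of IT load. The figures and the amortization period below are hypothetical, chosen only to show the shape of the calculation:

```python
def cost_per_added_kw(capex_usd: float, opex_usd_per_year: float,
                      added_it_kw: float, amortization_years: int = 10) -> float:
    """Annualized cost of each additional kW of IT load: build-out CAPEX
    amortized over a fixed period, plus yearly OPEX, divided by load added."""
    annualized = capex_usd / amortization_years + opex_usd_per_year
    return annualized / added_it_kw

# Hypothetical example: a 500 kW expansion costing $6M to build
# and $900k/year to power, cool, and staff.
print(round(cost_per_added_kw(6_000_000, 900_000, 500)))  # 3000 (USD per kW-year)
```

Service providers track a number like this for every square foot, kW, and BTU they add; a private datacenter owner who does the same can evaluate each new infrastructure request on comparable terms.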

More Stories By Roger Strukhoff

Roger Strukhoff (@IoT2040) is Executive Director of the Tau Institute for Global ICT Research, with offices in Illinois and Manila. He is Conference Chair of @CloudExpo & @ThingsExpo, and Editor of SYS-CON Media's CloudComputing BigData & IoT Journals. He holds a BA from Knox College & conducted MBA studies at CSU-East Bay.