The scale of corporate IT infrastructure has grown dramatically over the past decade and a half. At many companies, it has moved from basements with a few dozen servers to sophisticated data centers housing thousands or tens of thousands of them. Networked storage hardly existed in the early ’90s, but today it costs large IT organizations tens of millions of dollars.

There are good reasons for this expansion. Infrastructure runs the applications that process transactions, houses the customer data that yield market insights, and supports the analytical tools that help executives and managers make and communicate the decisions shaping complex organizations. In fact, infrastructure has made possible much of the corporate growth and rising productivity of recent years.

Yet the very ubiquity of these computing, storage, and networking technologies makes some executives regard IT infrastructure as a commodity. That’s a mistake. Yes, components such as servers and storage—even some support processes, like the monitoring of applications—have been commoditized. Even so, an effective infrastructure operation creates value by making sound choices about which technologies to use and how to integrate them. A technology product purchased from a vendor may be a commodity, but the ability to bring together hardware, software, and support to provide the right combination of cost, resiliency, and features for a new application isn’t.

Especially now, when every expenditure and budget item receives careful scrutiny, infrastructure leaders must engage with business executives and application developers to expose potential sources of value, agree on priorities, and measure not only the cost but also the impact of infrastructure.
