Thirteen years ago, Nick Carr published a seminal article in Harvard Business Review arguing that IT doesn’t matter. As the thinking went, enterprise IT spent far too much time and money rebuilding the same applications and infrastructure that were essentially commoditized. The rise of cloud computing and open source seemed to confirm his suspicions, leading many to conclude that outsourcing was the right way to minimize investment in such commodity code.
Those people were wrong.
As Redmonk analyst James Governor correctly argued a year ago, “Cloud is of course itself a form of outsourcing, but one that allows for speed of delivery, and encourages reshoring of skills. But people and processes changes are needed to do the work.” The people who make an ever-bigger impact on an enterprise’s ability to differentiate and compete are its developers, and outsourcing them was one of the worst ideas ever conceived.
Getting rid of IT
Thirteen years ago, Carr suggested that enterprises didn’t need to bother with infrastructure (or, by extension, the people that fed it):
[T]he core functions of IT—data storage, data processing, and data transport—have become available and affordable to all. Their very power and presence have begun to transform them from potentially strategic resources into commodity factors of production. They are becoming costs of doing business that must be paid by all but provide distinction to none.
This led enterprises to outsource, with outsourcing dollars climbing every year since Carr’s article was written. Though the IT outsourcing market may be cooling off in 2016, it still topped a whopping $442 billion in 2015. While Carr can’t take all the credit (or blame), he helped encourage the belief that hardware and software were best managed as a cost of doing business, with outsourcing firms well positioned to lower those costs.
Unfortunately, many enterprises took this to heart and outsourced with reckless abandon, despite warnings (including from IT outsourcing firms) that outsourcing software development, in particular, was foolhardy. Now the savvier companies are facing their day of reckoning on outsourcing. At AWS re:Invent in 2015, GE took the stage with a bold confession: “Like many of you we outsourced way too much.”
Informing this opinion is the resurgent belief that IT matters. A lot.
Bringing the source back in
Yes, IT will often be “outsourced” to Amazon or another cloud provider, but this isn’t because enterprises hope to dump the bother of IT on a public cloud vendor. Quite the opposite. Instead, they’re turning to public cloud as a way to make their infrastructure more elastic and more flexible in the face of ever-changing demands.
AWS product strategy chief Matt Wood put it this way:
Those that go out and buy expensive infrastructure find that the problem scope and domain shift really quickly. By the time they get around to answering the original question, the business has moved on. You need an environment that is flexible and allows you to quickly respond to changing big data requirements.
That’s the hardware. But the most critical components are the people figuring out the ideal architectures and applications to feed into AWS, Microsoft Azure, or Google Cloud “hardware.” These are the developers, the “kingmakers” that Redmonk often talks about. They’re the ones driving the growing realization that software is eating the world, and they’re the ones gobbling up entire industries with code.
Open source code. Code running in the public cloud.
So, cancel those outsourcing contracts. Bring software development back in house, and let the people closest to your business choose the software and hardware they build and run. IT is not a cost; it’s the thing that can separate your enterprise from the pack.