Standardization Drives Innovation
15th June 2012
Diversity may be a wonderful thing for our culture, but the sheer diversity of data center parts and systems causes IT managers nothing but headaches. Today's massive data centers house thousands of different servers, motherboards, components and accessories, each built to slightly different design specs and each compatible with only a limited number of other parts. As you might imagine, supporting such systems is nightmarishly complex and labor-intensive, and that effort could be spent on creating better systems instead. What's the answer? Standardization.
That's one of the primary goals of the Open Compute Project Foundation, which LGE is proud to work with as virtual CFO. Standardized hardware based on open design standards solves the problem of having to buy one particular part from one particular manufacturer. The intent is to refocus engineering effort on genuine value-added innovation rather than redundant design work. After all, coming up with a dozen slightly different layouts for the same old motherboard doesn't add much value. But what if you could take all the time and labor spent on minute structural variations and put it into creating new components and systems that deliver greater computing power with less energy? Suddenly you're innovating instead of duplicating, and that's the direction we believe standardization can take the IT industry.
What's fueling the shift toward standardization? Not the manufacturers and suppliers, who would rather keep making proprietary parts that their customers have to keep buying. The push is coming from buyers who don't want to purchase and support multiple variants of the same basic part when there's a smarter, more efficient solution. An open design standard opens the door to cost-effective operation as well as innovation, and both are good for business.