I’ve seen the future, and it runs in a software-defined environment – Part 1

During the latter half of my career I’ve spent a lot of time working with disruptive application technologies, so I know firsthand just how dynamic and unpredictable new business workloads can be in terms of infrastructure utilization. Yet IT staffs are still largely supporting this new breed of applications with data center technologies, processes and procedures that were originally developed to manage highly repetitive, predictable sequential transactions. The tension between twenty-first-century workloads and twentieth-century IT is almost palpable, and the answer, according to some, will be something called the software-defined environment (SDE).

IBM Fellow Jeff Frey talking SDE

Being an inquisitive IBMer (is there any other kind?), I wanted to better understand our SDE strategy. After my searches turned up very little formal information—mainly this brief article on the IBM PartnerWorld website and a short YouTube video—I decided to pay a visit to my good friend and colleague Jeff Frey.

Jeff is an IBM Fellow and Chief Technology Officer of the System z platform. His fingerprints are all over every major advance in mainframe technology for the past 30 years, so I had a feeling that he’d be able to fill my knowledge gap. I was not disappointed!

So what’s the problem?

According to Jeff, even in our current age of massive virtualization—where the physical resources of a computer system can be stretched to support hundreds, or even thousands, of virtual images—those virtual resources are still allocated to workloads on a largely manual basis. Does your web site need new web server instances to handle unexpected demand? Those virtual servers have to be explicitly defined, provisioned and assigned. Today’s automation tools can help, but they don’t fully solve the problem. Data centers need more intelligent automation and optimization that builds on virtualization as a foundation and delivers greater operational efficiency, flexibility and responsiveness at lower cost.
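To make Jeff’s point concrete, here is a minimal sketch of what that kind of intelligent automation would do in place of an operator: watch the request rate and commission or decommission virtual web servers to match it. The class, names and thresholds below are purely illustrative assumptions, not any real IBM tool or API.

```python
# Hypothetical sketch: an autoscaler that provisions virtual web server
# instances in response to demand instead of waiting for an operator.
# All names and numbers here are illustrative, not a real product API.

class AutoScaler:
    def __init__(self, min_servers=2, requests_per_server=100):
        self.min_servers = min_servers
        self.requests_per_server = requests_per_server
        self.servers = [f"web-{i}" for i in range(min_servers)]

    def reconcile(self, request_rate):
        """Commission or decommission servers to match current demand."""
        # Ceiling division: servers needed to absorb the request rate.
        needed = max(self.min_servers,
                     -(-request_rate // self.requests_per_server))
        while len(self.servers) < needed:          # scale out
            self.servers.append(f"web-{len(self.servers)}")
        while len(self.servers) > needed:          # scale in
            self.servers.pop()
        return len(self.servers)

scaler = AutoScaler()
print(scaler.reconcile(450))   # demand spike: 450 req/s -> 5 servers
print(scaler.reconcile(50))    # demand falls back -> minimum of 2
```

The point of the sketch is the `reconcile` loop: once the infrastructure is programmable, "define, provision and assign" becomes a control loop rather than a ticket queue.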

What is SDE and how does it help?

Jeff believes the answer lies in expressing the virtualized resources of an IT infrastructure as “software.” When you are able to represent hardware as software, you make the entire IT ecosystem programmable, dynamic and subject to far higher degrees of management automation than can currently be achieved.

He says that today most people view virtualization solely as a means to drive higher levels of utilization of their physical resources. We need to take the concept of virtualization a step further and treat it as a mechanism for gaining full operational control over the data center by making the infrastructure programmable.

The goal of SDE is to do just that. Jeff’s view is that SDE provides the means by which compute, storage and network resources are expressed as software and thereby made programmable. A programmable infrastructure allows these precious resources to be cataloged, commissioned and decommissioned, repurposed and repositioned automatically, and to be augmented with “intelligence” that drives deeper resource optimization. It opens the door to unprecedented levels of agility, efficiency and alignment with service level objectives.
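What does "resources expressed as software" look like? As a hedged illustration only (the class and resource names are hypothetical, not part of any SDE product), a programmable infrastructure treats resources as ordinary software objects that a program can catalog, commission, decommission and repurpose:

```python
# Hypothetical sketch of a "programmable" resource catalog: virtualized
# resources represented as plain software objects under program control.
# Names ("lun-7", "vm-3") and methods are illustrative assumptions.

class ResourceCatalog:
    def __init__(self):
        self._resources = {}   # name -> {"kind": ..., "purpose": ...}

    def commission(self, name, kind, purpose):
        self._resources[name] = {"kind": kind, "purpose": purpose}

    def decommission(self, name):
        self._resources.pop(name, None)

    def repurpose(self, name, new_purpose):
        self._resources[name]["purpose"] = new_purpose

    def with_purpose(self, purpose):
        return [n for n, r in self._resources.items()
                if r["purpose"] == purpose]

catalog = ResourceCatalog()
catalog.commission("lun-7", kind="storage", purpose="batch")
catalog.commission("vm-3", kind="compute", purpose="batch")
catalog.repurpose("vm-3", "analytics")    # redeployed without touching hardware
print(catalog.with_purpose("analytics"))  # ['vm-3']
```

Because the catalog is just software, the "intelligence" Jeff mentions—optimization policies, service level enforcement—can be layered on top as ordinary code.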

Now I have yet to find a data center that is entirely standardized on a single virtualization technology. Looking at compute virtualization alone, one is likely to encounter some mix of IBM PowerVM, z/VM, KVM, VMware ESX server, Microsoft Hyper-V and perhaps a handful of other technologies in any given data center. How is it possible to create a programming layer across such a diverse set of server virtualization technologies?

The answer, according to Jeff, comes from the OpenStack project. The IT industry, including IBM, is settling on OpenStack as the software-defined infrastructure (SDI) for SDE. The OpenStack SDI spans all of these virtualized compute environments and will enable the federation of heterogeneous compute, storage and network environments into a single, programmable infrastructure.
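The federation pattern itself is straightforward to sketch: a single programmable entry point in front of per-technology drivers, which is broadly how OpenStack’s compute layer accommodates different hypervisors. The driver classes below are toy stand-ins of my own, not real OpenStack Nova drivers:

```python
# Hypothetical sketch of federating heterogeneous hypervisors behind one
# programmable interface. The driver classes are illustrative stand-ins,
# not real OpenStack code.

class KVMDriver:
    def boot(self, name):
        return f"{name} booted on KVM"

class PowerVMDriver:
    def boot(self, name):
        return f"{name} booted on PowerVM"

class ZVMDriver:
    def boot(self, name):
        return f"{name} booted on z/VM"

class ComputeAPI:
    """Single entry point; callers never see which hypervisor serves them."""
    def __init__(self):
        self._drivers = {"kvm": KVMDriver(),
                         "powervm": PowerVMDriver(),
                         "zvm": ZVMDriver()}

    def boot_server(self, name, hypervisor):
        return self._drivers[hypervisor].boot(name)

api = ComputeAPI()
print(api.boot_server("web-1", "kvm"))
print(api.boot_server("db-1", "zvm"))
```

The caller programs against one API; adding a new virtualization technology means adding a driver, not rewriting the automation that sits above it.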

In some ways this is not a new idea; it is the formalization of strategies that a number of industry leaders have been pursuing. For example, VMware’s vision for SDE is being marketed as the Software-Defined Data Center. However, the “data center” in VMware’s strategy is strictly built around x86 resources. Is x86 the only architecture in your data center?

IBM takes a broader and more inclusive view of SDE, driving SDE technology into all of its enterprise systems (such as IBM zEnterprise System), expert integrated systems (such as IBM PureApplication System) and modular and blade systems (such as IBM Flex System) to deliver an SDE experience that more closely matches the realities of today’s data centers.

How does SDE relate to cloud computing? Stay tuned—I’ll cover that in Part 2. In the meantime, I’ll look forward to hearing your thoughts. Leave your comments below or connect with me on Twitter.


Paul DiMarzio has over 25 years of experience with IBM focused on bringing new and emerging technologies to the mainframe. He is currently part of the System z Growth business line, with specific focus on cross-industry business analytics offerings and the mainframe strategy for the insurance industry. You can reach Paul on Twitter: @PaulD360
