Parsing HP’s perspective on the Software-Defined Datacenter

This is Part 1 of a three-part post based on my understanding of the basics of the software-defined datacenter as HP sees it, as espoused by Helen Tang, HP Converged Infrastructure Worldwide Solution Lead, in this blog post here.

Part 2 follows almost immediately, and will cover my thoughts on her list of requirements for the SDDC journey; Part 3 will cover what I think of what HP is publicly doing to enable this.

The 3-part series may be interspersed with a post on my reasons for focusing on HP.

In my post here, I asked just what “software-defined” is.

HP’s Calvin Zito (@HPStorageGuy), Social Media Strategist for HP Storage, responded with a link to his informative ChalkTalk video introduction to software-defined storage, and Chris Purcell (Chrispman01), Influencer Marketing Manager for HP Converged Systems chimed in with this very good breakdown.

For which I thank them both.

Last week, HP’s Converged Infrastructure Worldwide Solution Lead, Helen Tang, posted an excellent perspective on a software-defined datacenter. Titled While Software-Defined holds the promise of changing everything – you need to do your homework, her blog post is in simple, jargon-free English.

I found the article to be very interesting, since it deals not only with the software-defined concept as it relates to computing components, but also takes a holistic view of a software-defined datacenter, an ideal I am pursuing not only internally, but also for the client computing assets we manage.

I will be using aspects of the article as a guide, if you will, on my own software-defined journey.

In this blog series, however, I will attempt to deconstruct her article and add my thoughts, questions, and concerns to the mix.

While Software-Defined holds the promise of changing everything – you need to do your homework
The aforementioned article states that datacenters of the [near] future need a complete, full-lifecycle approach that is at once integrated, streamlined, automated, efficient, and simple.

That, concisely, is what I am sure everyone wants. The problem is getting there.

“In simple terms, the vision of a ‘Software Defined Data Center’ – or SDDC – is where ‘control-enabled’ business applications will influence their own infrastructure based on business conditions; in concert with an application control plane that prioritizes all resource requests. The software defined environment is policy-based and controls virtually all aspects of the data center.”

This is where issues start to crop up, in my opinion.

While I like the fact that we would be able to dynamically reconfigure datacenters based on changing business needs, this requires a level of automation and management across several datacenter components, from servers to networking to storage, and more.
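To make that concrete, here is a purely illustrative sketch of the kind of policy-based, priority-driven resource allocation the SDDC vision describes. Everything in it — the application names, the vCPU figures, the priority scale — is hypothetical, and no real control plane is this simple; it only shows the idea of a control plane granting resource requests in business-priority order.

```python
from dataclasses import dataclass

@dataclass
class ResourceRequest:
    app: str
    vcpus: int
    priority: int  # higher number = more business-critical

def allocate(requests, capacity_vcpus):
    """Grant requests in descending business priority until capacity runs out."""
    granted = []
    for req in sorted(requests, key=lambda r: r.priority, reverse=True):
        if req.vcpus <= capacity_vcpus:
            capacity_vcpus -= req.vcpus
            granted.append(req.app)
    return granted

# Hypothetical workload mix competing for 10 vCPUs of spare capacity
requests = [
    ResourceRequest("batch-reporting", vcpus=8, priority=1),
    ResourceRequest("order-processing", vcpus=6, priority=9),
    ResourceRequest("web-frontend", vcpus=4, priority=5),
]
print(allocate(requests, capacity_vcpus=10))  # ['order-processing', 'web-frontend']
```

Even this toy version hints at the real difficulty: the policy engine must have an accurate, unified view of capacity across servers, networking, and storage before any such decision is trustworthy.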

I believe it requires that most dreaded of clichés: single-pane-of-glass (SPOG) management. Current SPOG management schemes are fraught with issues, the most glaring of which is the constant need to drop from that supposedly all-seeing SPOG into a lower-level, dedicated management tool for deep configuration of particular datacenter components.

It also requires levels of interoperability and interconnectedness within hardware, software, and management platforms and schemas that are currently unseen in the industry.

Would this be allowed to happen?

Stakeholders have traditionally not interoperated unless faced with marginalization or extinction. Or worse, irrelevance.

“….SDDC is a promising way to better align IT to the speed of your business with open [sic] choices regarding how best to consume and/or deliver IT to maximize business value and IT agility.”

This is the Holy Grail for datacenter, and indeed computing, management.

Helen then makes a very apropos segue into the rôle of hardware in this software-defined future. And in doing so, she lays down what I believe is a warning billboard:

“Be very, very careful when you hear anyone say that hardware is no longer relevant in our brave new software defined world.”

Caveat emptor, indeed.

In this seeming headlong rush into software-defined nirvana, I am quite leery of every Tom, Dick, and Harry startup that claims to have solved the issue. I am actually more nervous about the industry stalwarts that position their current wares as software-defined without being able to tell me why they are just that.

So, what is HP doing about the Software-Defined Datacenter?

Well, HP has been on the software-defined beat since HP Tech Forum 2010, when their initiatives, under the Converged Infrastructure umbrella, were announced.

As a result of their prescient planning, they seem to have quite a lot of coherency in their SDDC strategy. According to Helen, HP’s goal of enabling IT to optimize the rapid creation and delivery of business services, reliably, has not changed.

She then makes the following three points:

#1 It’s about both Physical and Virtual
SDDC enables IT to optimize the rapid creation and delivery of business services, reliably, through policy-based automation, from the infrastructure up to the application using a unified view of physical and virtual resources – it is not wand waving and the creation of a magic realm. It is however, enabling application level controls for your entire data center, from infrastructure to operations and management, spanning physical and virtual.

The inference here is that the software-defined datacenter must be virtualized.

I am puzzled by this.

My assumption, prior to now, has been that while there would be an appreciable use of virtualization in the software-defined datacenter, or SDDC, such virtualization would be complementary to the use of ‘the physical’.

This point requires that I learn more.

#2 It’s about control at three different levels
We’re talking about control of all functions and resources in the data center from 3 perspectives: the application, the users, and the IT administrator. It’s going to be an integrated environment that weaves together control from these three angles, in a holistic, automated and streamlined fashion – this ensures dynamic, efficient control of IT services.

Bingo!

Management, and effective, efficient, granular management at that, is required.

This cannot be shirked; otherwise, the goal of a software-defined datacenter will never be realized.

#3 It’s about Open choices
SDDC aligns business and IT like never before by providing open choices regarding how best to consume and/or deliver IT for maximum agility, security and business value. To be effective, this must be accomplished through an integrated abstraction layer, that is Open Source that does not lock you in to any single vendor’s vision. Now the industry has not completely defined this layer yet. HP and other IT industry key players are driving this – as a key piece to the SDDC puzzle. And we intend on making this SDDC abstraction layer rock solid and enterprise ready, even though it comes from open source.

I could not disagree with this more strenuously.

Although I agree completely that extremely granular and deep-reaching interoperability among all stakeholders in a datacenter will be required to make this goal a reality, I do not see why it has to be open source.

While I would like to see all parties in my SDDC future adhere to open standards, I fail to see the need to make that an open source requirement.

One of the reasons I am against it is that I cannot shake the feeling that the use of open source components means the ISV is either not up to the task of developing the required components in-house to open standards, or has lazily offloaded component development to the community.

For full disclosure, I am not a fan of open source. Not in the least. The recent “Heartbleed” incident has not helped the open source cause for me, and has pushed me even more firmly against it. It would take a momentous development to remove my steadfast distaste for it.

Moreover, I see it as absolving software developers of required responsibilities should issues arise, as the audit trail stops at the faceless community.

I have a problem basing my business, and any financially backed QoS, on this.

What am I not seeing?

Part 2 follows shortly.

In a follow-up blog post, I will divulge why HP seems to be hogging most of my attention on this topic. I invite you to comment here or email me at sddc0514@absolutevista.com.

The source article is While Software-Defined holds the promise of changing everything – you need to do your homework, and is on HP’s Converged Infrastructure Blog.

© 2002 – 2014, John Obeto for Blackground Media Unlimited
