Everybody talks about virtualization. The technology is hyped and the doubters are deliberately ignored. But let us be honest: virtualization necessarily introduces new abstraction layers, which in turn make the resulting environments harder to handle. That’s why Gartner analyst Thomas Bittman noted some time ago that “virtualization without good management is more dangerous than not using virtualization in the first place.”

This is not about inventing a new discipline, as Forrester Research points out in a brand new report entitled “Managing the Virtual World is an Evolution, not a Revolution”: “Process discipline and streamlined management automation were already operational mandates, but the advent of virtualization on industry-standard servers exacerbates these requirements. Invest wisely in these technologies to avoid getting stranded with limited point solutions that offer little hope of integration into the broader initiatives for operational excellence.”

The doubters might even become more suspicious and ask why this report stresses what is already common sense: that systems management cannot succeed with fragmented tools and without a holistic approach at the process level. And what does the distinction between EVOLUTION and REVOLUTION bring to the customer, or to the CIO dealing with the backlash of virtualization?
Reading the review of the Forrester report by Denise Dubie, former senior editor at Network World, one finds that the four key product categories listed for IT managers who want to control a virtual environment seem artificially separated.

Of course, there are 1) the provisioning part, 2) the capacity management part, 3) the performance part, and 4) the automation part, but in virtual environments the essential question is not so much the complete list of disciplines as how they are interconnected. In enterprise reality, the provisioning issue is part of the performance issue, with capacity management as a prerequisite. Seen this way, automation is not just another category. It is what glues all these parts together – not just in the virtual environment but across physical and virtual borders.

Between the lines, the report shows once again that automation is not just the answer to real-time challenges in dynamic markets. It is the only way to deal with the complex interdependencies of hybrid and service-oriented environments: “Many IT services, like virtualization management, are reaching a level of complexity where sophisticated mathematical algorithms and object models of the servers are more precise and efficient than even your most talented engineers”.

To learn more about Virtualization Automation, view the UC4 Tour on this topic.

“Sometimes industry hypes can reciprocally reinforce each other, and sometimes they even coexist so closely that the question “Who was first?” verges on the chicken-or-egg causality dilemma. With “Virtualization” and “Cloud Computing” it’s different. Of course, they do “hype” each other, but the concept of cloud computing is not even thinkable without having virtualization technologies in mind.”

That’s how I started a blog post in October about an EMA White Paper on Virtualization Control through Intelligent Service Automation, written by former EMA analyst (now CA employee) Andi Mann. Notwithstanding this, I still feel that many people compare apples and oranges when talking about virtualization and cloud computing.

Reason enough for me to share with you a definition by Gartner which should add some clarity. Gartner characterizes virtualization as

“a disruptive technology in IT infrastructure and operations”, which has the potential to substantially reduce the cost of IT and support the business to embrace change, “and enables new models (styles) of computing such as cloud computing where massively scalable IT-enabled capabilities are delivered as a service to external customers using internet technologies”.

This distinction between a disruptive “technology” on the one hand and the “mode” in which companies obtain IT services on the other should be very helpful, because it perfectly explains the conditions under which hypes evolve – when upcoming technologies and economic parameters (recession) shake hands.

Of course, technology-wise virtualization is not a lone fighter; that is a lesson companies have learned painfully. The real move is a “quadruple jump” composed of integration, virtualization, management and automation.

Have you tried? We can help you!

After five years of being the top CIO priority, Business Intelligence has dropped to fifth place in Gartner’s 2010 Executive Program survey. Reason enough for Gartner analyst Mark McDonald to take a closer look and find out what has happened behind the scenes. His conclusion is surprising and reassuring at the same time.

The good news is that BI is neither sick nor dead. The bad news, for classical IT admins who still refuse to rack their brains over business-related topics, is that BI as a technology is silently being replaced by “Business Intelligence as a management capability” – which precisely reflects the shifting role of IT in general, and the fact that businesses nowadays need more than just technology to create value in complex, business-driven environments.

“Creating a business intelligence capability demands a broader set of people, processes and tools that work together to raise intelligence, analytics and business performance … The move from technology to capability is about time as most organizations have already progressed through the peak of their CAPEX curve and moved from buying solutions to applying solutions.”

Nowadays, it seems more appropriate to talk about the “intelligent business” than about “business intelligence”. Forget the cube specialists in their ivory towers. Business Intelligence has left pure theory behind, while intelligent businesses have already arrived at the front end, fostering seamless interaction between people, tools and processes.

That’s also why Josh Gingold and Scott Lowe from ZDNet can title their live webcast on the 20th of April “The new ROI: Return on Intelligence”. BI may have lost its position as a top technology priority in the Gartner survey, but its importance to enterprise performance is growing: “However, what many are now beginning to realize is that the return on an effective BI solution can actually exceed the cost of their entire IT budgets.”

The BI tools of UC4 are tightly coupled to process performance. That’s our way of measuring this new kind of ROI.
We invite you to have a look!

Nowadays, enterprise environments are more and more often a mix of physical, virtual and cloud infrastructure. A single point of management is therefore a prerequisite for meeting SLAs and for ensuring that business processes crossing platform, application and even physical borders are completed on time.
The funny thing is: as long as we lack visibility, we keep thinking in terms of hurdles and obstacles. But the moment we can manage physical, virtual and even cloud resources and applications from a single pane of glass, we can outpace the disruptions and unify multiple jobs into one coherent process flow.
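
To make “one coherent process flow” a little more concrete, here is a minimal sketch in Python. It is purely illustrative, not UC4 syntax or any product API: a single controller walks through one flow and routes each step to a handler for the environment it belongs to.

```python
# Illustrative sketch only: one process flow whose steps run on physical,
# virtual and cloud resources, driven from a single point of control.
# All names are hypothetical; real platform agents would sit behind the handlers.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class JobStep:
    name: str
    environment: str            # "physical", "virtual" or "cloud"
    action: Callable[[], None]  # the actual work the step performs

def make_handler(label: str) -> Callable[[JobStep], None]:
    """Create an environment-specific handler; in reality this would call a platform agent."""
    def run(step: JobStep) -> None:
        print(f"[{label}] running step: {step.name}")
        step.action()
    return run

def run_flow(steps: List[JobStep], dispatch: Dict[str, Callable[[JobStep], None]]) -> None:
    """Execute every step in order, routing each one to the handler for its environment."""
    for step in steps:
        dispatch[step.environment](step)

dispatch = {
    "physical": make_handler("datacenter"),
    "virtual":  make_handler("hypervisor"),
    "cloud":    make_handler("cloud API"),
}

flow = [
    JobStep("extract nightly orders", "physical", lambda: None),
    JobStep("transform on reporting VM", "virtual", lambda: None),
    JobStep("load into SaaS analytics", "cloud", lambda: None),
]

run_flow(flow, dispatch)
```

The point of the sketch is the single dispatch table: one place that knows about all three worlds, so the flow itself stays one unbroken sequence instead of three disconnected job chains.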

But even then we are not on target, because at the very moment we achieve this coherence another effect appears: a performance boost built from end-to-end visibility, seamless workload distribution and unprecedented processing power. Looking more closely at these pillars of intelligent service automation, one might be reminded of another concept – a connection I came across in a post by Theo Priestley, an independent analyst and BPM visionary:

“Who remembers SETI@home, the project run by SETI to harness internet connected PC’s across the globe to help analyse signals from space? It was an early and successful attempt at mass distributed (or grid) computing using a small piece of software to use latent CPU cycles on client machines when the screensaver was engaged.

Now jump forward and the question is why hasn’t anyone taken this concept into the enterprise and into the BPM world itself? If you can imagine the many desktops that exist in an organisation sitting fairly idle when they could act as a BPM grid project to:

  • analyse, predict and act upon real-time data,
  • alter business rules on the fly,
  • create intelligent workflows,
  • perform background simulation and CEP

Why bother with expensive server hardware (and future upgrades, etc.) when there’s potentially far more power sitting across the organisation not being fully utilised? Are there any examples of this in the BPM industry currently? If so, it would be good to hear about them.”

Yes, Theo, there are examples – potential case studies are queuing up in front of our doors. It seems to me that we adapted this grid concept to the enterprise almost by chance. In any case, technologically we are ready.
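
For readers who prefer code to analogies, here is a toy sketch of the idea behind such a desktop grid, assuming a shared task queue and a Unix-style load average. The queue, threshold and task format are invented for illustration and are not part of any UC4 component.

```python
# Illustrative sketch only: a desktop "grid worker" that pulls analysis tasks
# from a shared queue whenever the local machine is mostly idle.
import os
import queue
import time

IDLE_THRESHOLD = 0.25  # run tasks only while the per-core load average is low

# Stand-in for a shared, network-backed task queue (e.g. a message broker).
task_queue: "queue.Queue[dict]" = queue.Queue()

def machine_is_idle() -> bool:
    """Treat the machine as idle when the 1-minute load average per core
    is below the threshold (Unix-only; Windows would need another probe)."""
    load_1min, _, _ = os.getloadavg()
    return (load_1min / os.cpu_count()) < IDLE_THRESHOLD

def run_task(task: dict) -> None:
    """Placeholder for real work: rule evaluation, simulation, CEP, and so on."""
    print(f"processing task {task.get('id')} ...")
    time.sleep(task.get("duration", 1))

def worker_loop(poll_seconds: int = 5) -> None:
    """Run forever: take a task only when the desktop has spare cycles."""
    while True:
        if machine_is_idle():
            try:
                task = task_queue.get(timeout=poll_seconds)
            except queue.Empty:
                continue
            run_task(task)
            task_queue.task_done()
        else:
            time.sleep(poll_seconds)  # yield to the interactive user
```

In a real deployment the queue would be a network service and the worker would run on every desktop, picking up analysis, simulation or CEP tasks only while the machine is otherwise idle – exactly the latent capacity Theo is pointing at.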

“According to a recent survey conducted by Vanson Bourne on behalf of UC4 Software, almost half of the 300 IT managers surveyed are planning to develop a hybrid IT environment – a blend of physical, virtual and cloud infrastructure. And two-thirds of respondents consider managing workloads in a hybrid environment challenging or very challenging.”

Talking about automation today means talking about dynamically executed end-to-end automation. That’s why serious automation efforts nowadays must consider dependencies not just between standard applications, databases, platforms and legacy applications, but also between the virtual and the real world.

According to Ken Jackson, in an article on Data Center Knowledge, that’s where Intelligent Service Automation rings in the next phase in the evolution of workload automation: “According to the UC4 survey, eight in 10 respondents say automating between hybrid environments is important or very important. Increased confidence in automation between hybrid environments would increase the use of virtualization in 84 percent of cases and increase the use of cloud computing in 81 percent of the cases.”

Companies nowadays are grids of functions, people and relations focused on business processes – with dependencies wherever you look. That’s why you need a lot of intelligence to keep processes controllable and visible. Intelligence means management, which in turn is the key to bridging these worlds. This is not about night-shift jobs anymore. This is not about batch modalities. This is about the future of the cloud AND the future of automation.

People like to think in “either-or” solutions, trying to make their lives easier, perhaps unconsciously trusting the paradox that more choices may lead to poorer decisions (the Paradox of Choice). This is in spite of a reality that often follows a fuzzier, more compromising “both-and” logic.

Take the hype about cloud computing. Although the world is full of cloud apologists nowadays, one should bear in mind that the cloud market is still nascent: so far only 4 percent of small and enterprise companies in the United States and Europe take advantage of it. From a management point of view, this means that in the near future we will have to deal with the reality of hybrid environments and with ever more processes connecting physical, virtual and/or cloud-based platforms.

Bob Muglia, president of Microsoft’s Server and Tools Business, proves that serious cloud providers like Microsoft share this view: “There’s more than one flavor of cloud computing, including private clouds that run on a business’s on-site servers. And it needn’t be an all-or-nothing proposition; we expect customers to want to integrate on-premises datacenters with an external cloud”.

This is the reality one needs to keep in mind when evaluating the new “Agent for Web Services” that UC4 unveiled some weeks ago. In a conversation I had last week with Vincent Stueger, Chief Technology Officer of UC4 Software, he told me why this Agent is “a really big and promising step. Because it’s much more than offering a simple web interface to remotely start a job or change a parameter. With this agent you can seamlessly monitor and control a process from the ground to the cloud and back.”
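
To give a feeling for what even the “simple” part – remotely starting a job over a web interface – looks like in practice, here is a hypothetical sketch. The endpoint, payload and status values are invented for illustration; this is not the actual interface of the Agent for Web Services.

```python
# Hypothetical sketch only: start a job over a web service and poll its status.
# The endpoint, payload and status values are invented; this is NOT the actual
# interface of UC4's Agent for Web Services.
import json
import time
import urllib.request

BASE_URL = "https://example.invalid/automation/api"  # placeholder endpoint

def start_job(name: str, params: dict) -> str:
    """Submit a job definition and return the identifier the service assigns to it."""
    body = json.dumps({"job": name, "params": params}).encode("utf-8")
    request = urllib.request.Request(
        f"{BASE_URL}/jobs",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)["job_id"]

def wait_for(job_id: str, poll_seconds: int = 10) -> str:
    """Poll the job until the service reports a terminal status."""
    while True:
        with urllib.request.urlopen(f"{BASE_URL}/jobs/{job_id}") as response:
            status = json.load(response)["status"]
        if status in ("SUCCEEDED", "FAILED"):
            return status
        time.sleep(poll_seconds)

# Example: kick off a cloud-side step from the ground and wait for it.
# job_id = start_job("replicate-to-cloud", {"dataset": "orders"})
# print(wait_for(job_id))
```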

If Milind Govekar, Research Vice President at Gartner, is to be believed, this bridging capability will decide not just the future of automation but also the future of the cloud: “The ability of the technology platform to manage a hybrid environment consisting of legacy and cloud applications with a single automation platform will be needed for clearing the way for greater adoption of cloud-based models.”

The cloud is not our destiny, but it brings a big choice – if we are able to provide the bridges.

Don’t worry, I don’t want to open a new chapter in the “chicken or the egg” causality dilemma. But when I stumbled upon an argument by Bernard Golden, the author of the famous book Virtualization for Dummies, I was briefly reminded of a dead-end street called “Business/IT alignment” that we walked down some months ago.

Forget the wall separating IT and business. There is no such thing. It is no longer true that the business pre-exists and IT is just a representation of what is happening on the business side. Therefore any “paving the cow paths” approach to computing will fall short, as Golden emphasizes in his brilliant article:

“In the past, IT was used to automate repeatable business processes — taking something that already exists and computerizing it. The archetype for this kind of transformation is ERP — the automation of ordering, billing, and inventory tracking. That “paving the cow paths” approach to computing is changing. Today, businesses are delivering new services infused and made possible by IT — in other words, creating new offerings that could not exist without IT capabilities.”

What Golden describes here is the end of the reaction model of IT as we know it – the model in which someone acts and IT reacts. It is no longer a pull approach but a push approach, in which location-sensitive devices and mashed-up applications interact with each other as part of data-driven processes.

Dynamization changes not only the application architecture, but also the requirements that service-aware automation technologies must meet. In these highly variable environments, applications will – according to Golden – need to “dynamically scale”, “to gracefully and dynamically add new data streams as inputs” and “to rapidly shift context and data sets”.

Does this new variability sound familiar to you? No wonder: it is the linchpin of UC4’s Intelligent Service Automation. Workloads need to be distributed dynamically, driven out of the process itself, because events are the heartbeats of modern applications. That’s why workloads adhere neither to system borders nor to business hours. They just don’t care.
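
As a closing illustration of events as heartbeats, here is a small conceptual sketch: each incoming event, rather than a fixed schedule, triggers a placement decision on whichever resource currently has the most headroom. Resource names and event sources are made up; this is not UC4 code.

```python
# Conceptual sketch only: event-driven workload placement on the least-loaded
# resource in a mixed pool of physical, virtual and cloud capacity.
import heapq
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass(order=True)
class Resource:
    load: int                       # running workload count; orders the heap
    name: str = field(compare=False)

@dataclass
class Event:
    source: str
    payload: Dict[str, str]

def dispatch(event: Event, pool: List[Resource]) -> str:
    """Place the workload triggered by `event` on the least-loaded resource."""
    target = heapq.heappop(pool)    # least-loaded resource comes out first
    target.load += 1                # account for the newly placed workload
    heapq.heappush(pool, target)
    return f"{event.source} -> {target.name}"

pool = [Resource(0, "vm-eu-1"), Resource(2, "blade-07"), Resource(1, "cloud-pool-a")]
heapq.heapify(pool)

for event in [Event("order-feed", {}), Event("sensor-42", {}), Event("order-feed", {})]:
    print(dispatch(event, pool))
```

The schedule has disappeared entirely from this picture: arrival of an event is the only trigger, and the pool does not care whether the chosen resource sits in the datacenter, on a hypervisor or in the cloud.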