Archive for the ‘Virtualization’ Category

Everybody talks about virtualization. The technology is hyped and the doubters are conveniently ignored. But let us be honest: virtualization necessarily introduces new abstraction levels, which in turn bring new constraints in handling. That’s why Gartner analyst Thomas Bittman noted some time ago that “virtualization without good management is more dangerous than not using virtualization in the first place.”

This is not about inventing a new discipline, as Forrester Research points out in a brand new report entitled “Managing the Virtual World is an Evolution, not a Revolution”: “Process discipline and streamlined management automation were already operational mandates, but the advent of virtualization on industry-standard servers exacerbates these requirements. Invest wisely in these technologies to avoid getting stranded with limited point solutions that offer little hope of integration into the broader initiatives for operational excellence.”

The doubters might become even more suspicious and ask why this report stresses the common-sense point that systems management cannot succeed with fragmented tools and without a holistic approach at the process level. And what does the distinction between EVOLUTION and REVOLUTION bring to the customer, or to the CIO dealing with the backlash of virtualization?
In Denise Dubie’s review of the Forrester report – she is a former senior editor at Network World – the four key product categories listed for IT managers who want to control a virtual environment seem artificially separated.

Of course, there are 1) the provisioning part, 2) the capacity management part, 3) the performance part, and 4) the automation part, but the essential question in virtual environments is not so much a complete list of all the disciplines as how these are interconnected. Because in enterprise reality, the provisioning issue is part of the performance issue, with capacity management as a prerequisite. Considering this, automation is not just another category. It is what glues all these parts together – not just in the virtual environment but across physical and virtual borders.

Between the lines, the report proves once again that automation is not just the answer to real-time challenges in dynamic markets. It is the only way to deal with the complex interdependencies of hybrid and service-oriented environments: “Many IT services, like virtualization management, are reaching a level of complexity where sophisticated mathematical algorithms and object models of the servers are more precise and efficient than even your most talented engineers”.

To learn more about Virtualization Automation view the UC4 Tour on this topic.


Read Full Post »

“Sometimes industry hypes can reciprocally enforce each other and sometimes they even coexist so closely that the question: “Who was first?” verges on the “chicken or the egg causality dilemma”. With “Virtualization” and “Cloud Computing” it’s different. Of course, they do “hype” each other but the concept of cloud computing is not even thinkable without having virtualization technologies in mind.”

That’s how I started a blogpost in October about an EMA White Paper on Virtualization Control through Intelligent Service Automation, written by former EMA analyst (now CA employee) Andi Mann. Notwithstanding this, I still feel that many people compare apples and oranges when talking about virtualization and cloud computing.

Reason enough for me to share with you a definition by Gartner which should add some clarity. Gartner characterizes virtualization as

“a disruptive technology in IT infrastructure and operations”, which has the potential to substantially reduce the cost of IT and support business to embrace change, “and enables new models (styles) of computing such as cloud computing where massively scalable IT-enabled capabilities are delivered as a service to external customers using internet technologies”.

This distinction between a disruptive “technology” on the one hand and the “mode” in which companies obtain IT services on the other should be very helpful, because it explains perfectly the conditions under which hypes evolve – when emerging technologies and economic parameters (such as a recession) shake hands.

Of course, technology-wise, virtualization does not fight alone. This is a lesson companies have learned painfully. It is rather a “quadruple jump” composed of integration, virtualization, management and automation.

Have you tried? We can help you!

Read Full Post »

Nowadays, enterprise IT is more and more often a mix of physical, virtual and cloud environments. A single point of management is therefore a prerequisite for meeting SLAs and ensuring that business processes crossing platform, application and even physical borders are completed on time.
The funny thing is: as long as we lack visibility, we keep thinking in terms of hurdles and obstacles. But the moment we can manage physical, virtual and even cloud resources and applications from a single pane of glass, we can outpace the disruptions and unify multiple jobs into one coherent process flow.
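The “single pane of glass” idea can be sketched in a few lines: one control point drives a flow whose steps run in different environments. The class and job names below are illustrative assumptions, not a real product API:

```python
from dataclasses import dataclass, field

@dataclass
class Job:
    name: str
    environment: str  # "physical", "virtual" or "cloud"
    done: bool = False

@dataclass
class ProcessFlow:
    """A single coherent flow chaining jobs from different environments."""
    jobs: list = field(default_factory=list)

    def add(self, job):
        self.jobs.append(job)

    def run(self):
        # One control point executes every step in order,
        # regardless of where each job physically runs.
        for job in self.jobs:
            job.done = True

    def status(self):
        # One view over all steps -- the "single pane of glass".
        return {job.name: job.done for job in self.jobs}

flow = ProcessFlow()
flow.add(Job("extract-orders", "physical"))
flow.add(Job("transform", "virtual"))
flow.add(Job("load-to-warehouse", "cloud"))
flow.run()
```

The point of the sketch is the shape, not the mechanics: three jobs, three environments, one flow and one status view.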

But even then we are not at the finish line. At the very moment we achieve this coherence, another effect appears – a performance boost built from end-to-end visibility, seamless workload distribution and unprecedented processing power. Looking more closely at these pillars of intelligent service automation, one might be reminded of another concept – a connection I came across in a post by Theo Priestley, an independent analyst and BPM visionary:

“Who remembers SETI@home, the project run by SETI to harness internet connected PC’s across the globe to help analyse signals from space? It was an early and successful attempt at mass distributed (or grid) computing using a small piece of software to use latent CPU cycles on client machines when the screensaver was engaged.

Now jump forward and the question is why hasn’t anyone taken this concept into the enterprise and into the BPM world itself? If you can imagine the many desktops that exist in an organisation sitting fairly idle when they could act as a BPM grid project to:

  • analyse, predict and act upon real-time data,
  • alter business rules on the fly,
  • create intelligent workflows,
  • perform background simulation and CEP

Why bother with expensive server hardware (and future upgrades etc) when there’s potentially far more power sitting across the organisation not being fully utilised? Are there any examples of this in the BPM industry currently, if so would be good to hear about it.”

Yes Theo, there are examples – potential case studies are queuing up in front of our doors. It seems to me that we adapted this grid concept to the enterprise almost by accident. In any case, technologically we are ready.
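The grid idea Priestley describes – farming chunks of work out to otherwise idle machines – can be sketched with a simple worker pool. This is a minimal illustration, not a BPM product feature; in a real grid each worker would be an agent harvesting latent CPU cycles on a client desktop, not a local thread:

```python
from concurrent.futures import ThreadPoolExecutor

def analyse(chunk):
    # Stand-in for the per-chunk analysis each idle desktop would run
    # (SETI@home-style: every client processes its own slice of data).
    return sum(chunk) / len(chunk)

# Each worker here stands in for one idle desktop in the grid;
# a real deployment would ship chunks to agents across the organisation.
chunks = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
with ThreadPoolExecutor(max_workers=3) as pool:
    results = list(pool.map(analyse, chunks))
```

The design choice that matters is the split: a coordinator that partitions the work, and many interchangeable workers that each process a small, self-contained chunk.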

Read Full Post »

“According to a recent survey conducted by Vanson Bourne on behalf of UC4 Software, almost half of the 300 IT managers surveyed are planning to develop a hybrid IT environment – a blend of physical, virtual and cloud infrastructure. And two-thirds of respondents consider managing workloads in a hybrid environment challenging or very challenging.”

Talking about automation today means talking about dynamically executed end-to-end automation. That’s why serious automation efforts nowadays must consider dependencies not just between standard applications, databases, platforms and legacy applications, but also between the virtual and the real world.
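The dependency awareness described above can be made concrete with a small sketch: jobs that run on different platforms still have to execute in prerequisite order, end to end. The job names are invented for illustration; Python’s standard `graphlib` does the ordering:

```python
from graphlib import TopologicalSorter

# Dependencies among jobs; each value is the set of prerequisites.
# "db-backup" runs on a physical server, "report" in a VM and
# "archive" in the cloud -- the order must hold across all three.
deps = {
    "report": {"db-backup"},
    "archive": {"report"},
    "db-backup": set(),
}
order = list(TopologicalSorter(deps).static_order())
```

For this chain there is only one valid order: backup first, then report, then archive – regardless of which world each job lives in.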

According to Ken Jackson, in an article on Data Center Knowledge, that’s where Intelligent Service Automation rings in the next phase in the evolution of workload automation: “According to the UC4 survey, eight in 10 respondents say automating between hybrid environments is important or very important. Increased confidence in automation between hybrid environments would increase the use of virtualization in 84 percent of cases and increase the use of cloud computing in 81 percent of the cases.”

Companies today are grids of functions, people and relations focused on business processes – with dependencies wherever you look. That’s why you need a lot of intelligence to keep processes controllable and visible. Intelligence means management, which in turn is the key to bridging these worlds. This is no longer about night-shift jobs. This is not about batch modalities. This is about the future of the cloud AND the future of automation.

Read Full Post »

People like to think in “either-or” solutions, trying to make their lives easier, perhaps unconsciously trusting the paradox that more choices may lead to a poorer decision (the Paradox of Choice). This is in spite of a reality which often follows a fuzzier, more compromising “both-and” logic.

Take the hype around cloud computing. Although the world is full of cloud apologists these days, one should bear in mind that the cloud market is still nascent: so far, only 4 percent of small and enterprise companies in the United States and Europe take advantage of it. From a management point of view this means that in the near future we will have to deal with the reality of hybrid environments and with ever more processes connecting physical, virtual and/or cloud-based platforms.

Bob Muglia, president of Microsoft’s Server and Tools Business, proves that serious cloud providers like Microsoft share this view: “There’s more than one flavor of cloud computing, including private clouds that run on a business’s on-site servers. And it needn’t be an all-or-nothing proposition; we expect customers to want to integrate on-premises datacenters with an external cloud”.

One needs to keep this reality in mind when evaluating the new “Agent for Web Services” that UC4 unveiled a few weeks ago. In a conversation I had last week with Vincent Stueger, Chief Technology Officer of UC4 Software, he told me why this agent is “a really big and promising step. Because it’s much more than offering a simple web interface to remotely start a job or change a parameter. With this agent you can seamlessly monitor and control a process from the ground to the cloud and back.”
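To picture what remotely triggering a job over a web-service interface involves, here is a hedged sketch that only builds the request. The host name, URL scheme and payload fields are purely illustrative assumptions – they are not UC4’s actual API:

```python
import json

def start_job_request(job_name, parameters):
    """Build the payload a web-service job trigger might carry.

    Endpoint and field names are invented for illustration;
    a real agent defines its own interface.
    """
    return {
        "method": "POST",
        "url": f"https://automation.example.com/jobs/{job_name}/start",
        "body": json.dumps({"parameters": parameters}),
    }

req = start_job_request("nightly-billing", {"region": "EU"})
```

The interesting part is not the HTTP plumbing but the contract: a job anywhere – on the ground or in the cloud – becomes addressable through one uniform call.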

If Milind Govekar, Research Vice President at Gartner, is to be believed, this bridging capability will decide not just the future of automation, but also the future of the cloud: “The ability of the technology platform to manage a hybrid environment consisting of legacy and cloud applications with a single automation platform will be needed for clearing the way for greater adoption of cloud-based models.”

The cloud is not our destiny, but it brings a big choice – if we are able to provide the bridges.

Read Full Post »

When talking about automation, people easily ignore the power of change and treat the contemplated processes as engraved in stone – in spite of the fact that “change is not new and change is natural”, as Thomas L. Friedman pointed out in his thought-provoking book The World Is Flat: “Change is hard. Change is hardest on those caught by surprise. Change is hardest on those who have difficulty changing too.”

Talking about change means talking about events – the secret currency of change, registering every single change of state. This is worth emphasizing because events are not only the drivers of today’s businesses and operations; they can occur everywhere – crossing platform, departmental and even enterprise borders.

Today you’re managing dynamic IT environments that are complex blends of physical, virtual and cloud-based resources. In such environments, transparency is key to staying agile and responsive. But even being reactive is not enough to keep your business situationally aware. To ensure that processes stay up to date and the engine is not automating errors and detours, any automation effort must be accompanied by an ongoing optimization effort.

The crux is that reaction and analysis mesh. Take the lunch break at school as a real-world example: the bell rings, and 10 seconds later everyone is standing in line at the cafeteria waiting to be served. Following the classical monitoring approach, cooking would start when the bell rings. Knowing more about the processes in the kitchen, the folks from UC4 start cooking two hours earlier – so everything is ready when the children arrive.

This kind of processing intelligence is key to avoiding overhead and running automated environments in a cost- and SLA-conscious way. Since the processes in the school are known, the ringing bell is a foreseeable event, so you had better not waste time and money on reducing the reaction time. On the other hand, it makes a lot of sense to monitor the cooking process as close to real time as possible: it ensures that you have all the processing options available – before the bell rings!
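The cafeteria logic boils down to simple arithmetic: subtract the known process duration from the foreseeable event time to get the right start time. A minimal sketch – the times and names are illustrative, not from any scheduler:

```python
from datetime import datetime, timedelta

def planned_start(event_time, lead_time):
    # Start early enough that the work finishes exactly when the
    # foreseeable event fires, instead of reacting after it.
    return event_time - lead_time

bell = datetime(2010, 1, 15, 12, 0)        # the foreseeable event
cooking_lead = timedelta(hours=2)          # known process duration
start = planned_start(bell, cooking_lead)  # kitchen starts at 10:00
```

Reactive monitoring would anchor everything on `bell`; predictive scheduling anchors on `start`, which is where the two hours are won.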

Knowing that change is a constant, not a variable, and that automation can only be effective if it is combined with intelligence, UC4’s Application Assurance solution incorporates real-time data, insight into complete end-to-end business and IT processes, and intelligent decision-making.

Have a look. It’s worth it!

Read Full Post »

Of course, December is always the time for predictions, especially when we are about to enter a new decade. No wonder December also marks the time when new buzzwords are created. One of these is “UC4”. Don’t laugh! Silicon Republic predicts that “UC4 is set to dominate the CIO’s agenda 2010”! You can imagine how I stumbled at first when I read this headline. 😉 But seriously, what does the new year hold – besides “Unified Communication, Collaboration and Contact Centre” (UC4)?

I will not contribute to this discussion with another buzzword. I just want to predict that it will probably, above all, be the big year of the user. This aligns closely with Brian Duckering, who predicts for 2010 that “management methods shift from system-based to user-based: Managing systems has always worked just fine. But it has gotten a lot more complicated and costly as users become more mobile and less predictable, demanding that their workspaces follow them from one device to another, seamlessly. For many this has caused a re-evaluation of what the purpose of IT actually is. The systems don’t create value for companies – the users do. Yet, the tools and methods predominantly deployed target devices, not people.”

Take a look at the still-maturing virtualization market. Forrester predicts that server virtualization will grow from 10% in 2007 to 31% in 2008 and 54% in 2011. That is a pretty impressive growth rate, of course. But it is nothing compared to the explosion the Gartner Group expects in the number of virtualized PCs over the same period: it will grow more than a hundredfold – from 5 million in 2007 to 660 million in 2011.

2010 will possibly be the year when the hyped technologies around virtualization hit the front end – where the user is waiting. This can also cause trouble – especially without transparent process management. Because “virtualization without good management is more dangerous than not using virtualization in the first place” – as Gartner analyst Thomas Bittman already put it in a nutshell a year ago.

Hope you are ready for 2010!

Read Full Post »

Older Posts »