Archive for the ‘Analyst Research’ Category

Everybody talks about virtualization. The technology is hyped and the doubters are deliberately ignored. But let us be honest: virtualization necessarily introduces new abstraction levels, which in turn bring new constraints in handling. That’s why Gartner analyst Thomas Bittman noted some time ago that “virtualization without good management is more dangerous than not using virtualization in the first place.”

This is not about inventing a new discipline, as Forrester Research points out in a brand new report entitled “Managing the Virtual World is an Evolution, not a Revolution”: “Process discipline and streamlined management automation were already operational mandates, but the advent of virtualization on industry-standard servers exacerbates these requirements. Invest wisely in these technologies to avoid getting stranded with limited point solutions that offer little hope of integration into the broader initiatives for operational excellence.”

The doubters might become even more suspicious and ask why this report stresses the common-sense point that systems management cannot succeed with fragmented tools and without a holistic approach at the process level. And what does the distinction between EVOLUTION and REVOLUTION bring to the customer, or to the CIO dealing with the backlash of virtualization?
Reading the review of the Forrester report by Denise Dubie, former senior editor at Network World, the four listed key product categories for IT managers who want to control a virtual environment seem artificially separated.

Of course, there are 1) the provisioning part, 2) the capacity management part, 3) the performance part, and 4) the automation part. But the essential question in virtual environments is not so much a complete list of all the disciplines as how these are interconnected. Because in enterprise reality, the provisioning issue is part of the performance issue, with capacity management as a prerequisite. Considering this, automation is not just another category. It is what glues all these parts together – not just in the virtual environment but across physical and virtual borders.
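If you think of these categories as nodes in a dependency graph, the “glue” role of automation becomes concrete. Here is a toy Python sketch (the task names are hypothetical and this is not UC4’s actual interface) showing how an automation layer can order provisioning, capacity and performance tasks end to end:

```python
# Toy sketch: automation as the glue that orders provisioning, capacity
# and performance tasks via explicit dependencies. Task names are made up.
from graphlib import TopologicalSorter  # Python 3.9+

# Each task maps to the set of tasks it depends on.
workflow = {
    "check_capacity":      set(),                    # capacity management first
    "provision_vm":        {"check_capacity"},       # provisioning needs capacity data
    "deploy_app":          {"provision_vm"},
    "monitor_performance": {"deploy_app"},           # performance closes the loop
}

def execution_order(deps):
    """Return one valid end-to-end execution order for the workflow."""
    return list(TopologicalSorter(deps).static_order())

print(execution_order(workflow))
```

The point of the sketch is simply that once dependencies are explicit, the “categories” stop being separate silos: the automation layer derives the end-to-end order from them.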

Between the lines the report proves again that automation is not just the answer to real-time challenges in dynamic markets. It is the only way to deal with the complex interdependencies of hybrid and service-oriented environments: “Many IT services, like virtualization management, are reaching a level of complexity where sophisticated mathematical algorithms and object models of the servers are more precise and efficient than even your most talented engineers.”

To learn more about virtualization automation, view the UC4 Tour on this topic.


Read Full Post »

“Sometimes industry hypes can reciprocally enforce each other and sometimes they even coexist so closely that the question: “Who was first?” verges on the “chicken or the egg causality dilemma”. With “Virtualization” and “Cloud Computing” it’s different. Of course, they do “hype” each other but the concept of cloud computing is not even thinkable without having virtualization technologies in mind.”

That’s how I started a blog post in October about an EMA white paper on Virtualization Control through Intelligent Service Automation, written by former EMA analyst (now CA employee) Andi Mann. Notwithstanding this, I still feel that many people compare apples and oranges when talking about virtualization and cloud computing.

Reason enough for me to share a definition by Gartner which should add some clarity. Gartner characterizes virtualization as

“a disruptive technology in IT infrastructure and operations”, which has the potential to substantially reduce the cost of IT and support business to embrace change, “and enables new models (styles) of computing such as cloud computing where massively scalable IT-enabled capabilities are delivered as a service to external customers using internet technologies”.

This distinction between a disruptive “technology” on the one hand and the “mode” in which companies obtain IT services on the other should be very helpful, because it perfectly explains the conditions for hype evolution – when upcoming technologies and economic parameters (recession) shake hands.

Of course, technology-wise virtualization is not a lone wolf. This is a lesson companies have learned painfully. It is more of a “quadruple jump” composed of integration, virtualization, management and automation.

Have you tried? We can help you!

Read Full Post »

After five years as the top CIO priority, Business Intelligence has dropped to fifth place in Gartner’s 2010 Executive Program survey. Reason enough for Gartner analyst Mark McDonald to take a closer look and find out what has happened behind the curtains. His conclusion is surprising and reassuring at the same time.

The good news is that BI is neither sick nor dead. The bad news for classical IT admins who still refuse to rack their brains over business-related topics is that BI as a technology is silently being replaced by “Business Intelligence as a management capability” – which precisely reflects the shifting role of IT in general, and the fact that businesses nowadays need more than just technology to create value in complex, business-driven environments.

“Creating a business intelligence capability demands a broader set of people, processes and tools that work together to raise intelligence, analytics and business performance … The move from technology to capability is about time as most organizations have already progressed through the peak of their CAPEX curve and moved from buying solutions to applying solutions.”

Nowadays, it seems more appropriate to talk about the “intelligent business” than about “business intelligence”. Forget the cube specialists in their ivory towers. Business Intelligence has left pure theory behind, while intelligent businesses have already arrived at the front end, fostering seamless interaction between people, tools and processes.

That’s also why Josh Gingold and Scott Lowe from ZDNet can entitle their live webcast on the 20th of April “The new ROI: Return on Intelligence” – proving that BI may have lost its position as a top technology priority in the Gartner survey, but is growing in importance to enterprise performance: “However, what many are now beginning to realize is that the return on an effective BI solution can actually exceed the cost of their entire IT budgets.”

The BI tools of UC4 are tightly coupled to process performance. That’s our way of measuring this new kind of ROI.
We invite you to have a look!

Read Full Post »

“According to a recent survey conducted by Vanson Bourne on behalf of UC4 Software, almost half of the 300 IT managers surveyed are planning to develop a hybrid IT environment – a blend of physical, virtual and cloud infrastructure. And two-thirds of respondents consider managing workloads in a hybrid environment challenging or very challenging.”

Talking about automation today means talking about dynamically executed end-to-end automation. That’s why serious automation efforts nowadays must consider dependencies not just between standard applications, databases, platforms and legacy applications, but also between the virtual and the real world.

According to Ken Jackson, in an article on Data Center Knowledge, that’s where Intelligent Service Automation rings in the next phase in the evolution of workload automation: “According to the UC4 survey, eight in 10 respondents say automating between hybrid environments is important or very important. Increased confidence in automation between hybrid environments would increase the use of virtualization in 84 percent of cases and increase the use of cloud computing in 81 percent of the cases.”

Companies nowadays are grids of functions, people and relations focused on business processes – with dependencies wherever you look. That’s why you need a lot of intelligence to keep processes controllable and visible. Intelligence means management – which in turn is the key to bridging these worlds. This is not about night-shift jobs anymore. This is not about batch modalities. This is about the future of the cloud AND the future of automation.

Read Full Post »

Of course, December is always the time for predictions, especially when we are about to enter a new decade. No wonder December also marks a time when new buzzwords are created. One of these is “UC4”. Don’t laugh! Silicon Republic predicts that “UC4 is set to dominate the CIO’s agenda 2010”! You can imagine how I stumbled when I first read this headline. 😉 But seriously, what does the new year hold – besides “Unified Communication, Collaboration and Contact Centre” (UC4)?

I will not contribute to this discussion with another buzzword. I just want to predict that, above all, it will probably be the big year of the user. This aligns closely with Brian Duckering, who predicts for 2010 that “management methods shift from system-based to user-based: Managing systems has always worked just fine. But it has gotten a lot more complicated and costly as users become more mobile and less predictable, demanding that their workspaces follow them from one device to another, seamlessly. For many this has caused a re-evaluation of what the purpose of IT actually is. The systems don’t create value for companies – the users do. Yet, the tools and methods predominantly deployed target devices, not people.”

Take a look at the still-maturing virtualization market. Forrester predicts that server virtualization will grow from 10% in 2007 to 31% in 2008 and 54% in 2011. That’s a pretty impressive growth rate, of course. But it is actually nothing compared to the explosion Gartner expects for the number of virtualized PCs in the same period: it will more than centuplicate – from 5 million in 2007 to 660 million in 2011.
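For the curious, those Gartner figures check out arithmetically. A quick back-of-the-envelope calculation:

```python
# Back-of-the-envelope check of Gartner's virtualized-PC forecast:
# 5 million machines in 2007 vs. 660 million in 2011.
pcs_2007 = 5_000_000
pcs_2011 = 660_000_000

growth_factor = pcs_2011 / pcs_2007
print(growth_factor)  # 132.0 – indeed more than a hundredfold

# The implied compound annual growth rate over the four-year span:
cagr = growth_factor ** (1 / 4) - 1
print(f"{cagr:.0%}")  # roughly 239% per year
```

In other words, the forecast implies the installed base more than tripling every single year for four years running.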

2010 will possibly be the year when the hyped technologies around virtualization hit the front end – where the user is waiting. This can also cause trouble – especially without transparent process management. Because “virtualization without good management is more dangerous than not using virtualization in the first place” – as Gartner analyst Thomas Bittman put it in a nutshell a year ago.

Hope you are ready for 2010!

Read Full Post »

Have you ever heard of the Global Information Industry Center (GIIC)? It’s part of the University of California, San Diego – situated close to where UC4 customers gathered for their annual user conference some weeks ago. The center just published its 2009 report on American consumers (entitled “How Much Information?”), trying to create a census of all forms of information an average American consumes in a single day.

Want to guess how much? It’s 34 gigabytes of content and 100,000 words of information in a single day.

The New York Times twists the knife in the wound, pointing out that this “doesn’t mean we read 100,000 words a day — it means that 100,000 words cross our eyes and ears in a single 24-hour period. That information comes through various channels, including the television, radio, the Web, text messages and video games.”

But why do we have this voracious appetite for information? The answer is maybe a whole lot simpler than you would think: because what we mainly eat is instant data, not nutritious information! It seems time for a diet – even on the business side. Business processes nowadays are accompanied by myriads of event-driven data, while at the same time we have to govern them almost in real time. In a situation like this, data is not enough. What we need are digestible pieces of information combined with pattern recognition capabilities.

Our diet plan is simple. Less junk data and more information bites. If you want to know what we use in the kitchen, get some UC4 Insight on our web. You will like the taste.

Read Full Post »

It was Thomas Samuel Kuhn (1922-1996), one of the most influential philosophers of science of the twentieth century, who revolutionized our picture of The Structure of Scientific Revolutions by claiming that, according to the Paradigm Concept, a mature science experiences alternating phases of normal science and revolutions: “In normal science the key theories, instruments, values and metaphysical assumptions that comprise the disciplinary matrix are kept fixed, permitting the cumulative generation of puzzle-solutions, whereas in a scientific revolution the disciplinary matrix undergoes revision, in order to permit the solution of the more serious anomalous puzzles that disturbed the preceding period of normal science.”

What this quotation from the Stanford Encyclopedia of Philosophy conceals is that in normal science the questions are also somewhat fixed and prefabricated, and that during these periods scientists normally raise only questions to which they already know the answers.

Our situation is not normal at all. It is a situation of fundamental change – and this not just because of the crisis. Mark McDonald from Gartner knows this. And he knows about the importance of raising the right questions. Questions which can move us forward: “Great CIOs ask good questions pretty much all the time. A good question is one that creates knowledge and shares understanding. A good question makes both parties smarter. Most questions are not great questions. Helpful yes, but they simply exchange information from one side to the other.”

I don’t want to withhold from you the following rough typology of great questions that Mark McDonald gives …

• Logic checking questions – If that is true, then these other things must be false?
• Implications based questions – So given this issue we are also seeing these other things happening?
• Proof of fact questions – so how do you know the issue is happening and what are the consequences?
• Forward looking questions – so given all of that, what are the next steps you suggest we take?

… also because most process optimization efforts follow the same steps.

By the way, Mark McDonald recently did a whole series of posts about what makes a great CIO.

Read Full Post »
