
Archive for the ‘Cloud Computing’ Category

“Sometimes industry hypes can reciprocally reinforce each other, and sometimes they even coexist so closely that the question ‘Who was first?’ verges on the chicken-or-egg causality dilemma. With ‘Virtualization’ and ‘Cloud Computing’ it’s different. Of course, they do ‘hype’ each other, but the concept of cloud computing is not even thinkable without having virtualization technologies in mind.”

That’s how I started a blog post in October about an EMA White Paper on Virtualization Control through Intelligent Service Automation, written by former EMA analyst (now CA employee) Andi Mann. Notwithstanding this, I still feel that many people compare apples and oranges when talking about virtualization and cloud computing.

Reason enough for me to share a definition from Gartner that should add some clarity. Gartner characterizes virtualization as

“a disruptive technology in IT infrastructure and operations” which has the potential to substantially reduce the cost of IT and help businesses embrace change, “and enables new models (styles) of computing such as cloud computing, where massively scalable IT-enabled capabilities are delivered as a service to external customers using internet technologies”.

This distinction between a disruptive “technology” on the one hand and the “mode” in which companies obtain IT services on the other should be very helpful, because it perfectly explains the conditions under which hype evolves – when emerging technologies and economic parameters (a recession) shake hands.

Of course, technology-wise virtualization is not a lone fighter – a lesson companies have learned painfully. It is rather a “quadruple jump” composed of integration, virtualization, management and automation.

Have you tried? We can help you!


Read Full Post »

Nowadays, enterprise IT is more and more often a mix of physical, virtual and cloud environments. A single point of management is therefore a prerequisite for meeting SLAs and ensuring that business processes crossing platform, application and even physical borders are completed on time.
The funny thing is: as long as we lack visibility, we keep thinking in terms of hurdles and obstacles. But the moment we can manage physical, virtual and even cloud resources and applications from one pane of glass, we can outpace the disruptions and unify multiple jobs into one coherent process flow.

But even then we are not yet on target, because at the very moment we achieve this coherence another effect appears – a boost in performance, built on end-to-end visibility, seamless workload distribution and unprecedented processing power. Looking more closely at these pillars of intelligent service automation, one might be reminded of another concept – a connection I came across in a post by Theo Priestley, an independent analyst and BPM visionary:

“Who remembers SETI@home, the project run by SETI to harness internet connected PC’s across the globe to help analyse signals from space? It was an early and successful attempt at mass distributed (or grid) computing using a small piece of software to use latent CPU cycles on client machines when the screensaver was engaged.

Now jump forward and the question is why hasn’t anyone taken this concept into the enterprise and into the BPM world itself? If you can imagine the many desktops that exist in an organisation sitting fairly idle when they could act as a BPM grid project to:

  • analyse, predict and act upon real-time data,
  • alter business rules on the fly,
  • create intelligent workflow,
  • perform background simulation and CEP

Why bother with expensive server hardware (and future upgrades etc) when there’s potentially far more power sitting across the organisation not being fully utilised? Are there any examples of this in the BPM industry currently, if so would be good to hear about it.”

Yes Theo, there are examples – potential case studies are queuing up in front of our doors. It seems to me that we have, almost by accident, adapted this grid concept to the enterprise. Anyway, technologically we are ready.
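
To make the idea a little more tangible, here is a minimal, purely illustrative sketch of such an enterprise grid – idle desktops pulling analysis tasks from a central queue, very much in the SETI@home spirit. Every name in it is invented for this post; it is not any vendor’s actual API.

```python
# Purely illustrative: a tiny "enterprise grid" in the spirit of SETI@home.
# A central queue hands analysis tasks to desktop workers that are currently idle.
# All names (is_desktop_idle, analyse, ...) are hypothetical, not a product API.

import queue
import random
import threading
import time

tasks: "queue.Queue[dict]" = queue.Queue()

def is_desktop_idle() -> bool:
    # Placeholder for a real idle check (no keyboard/mouse activity, low CPU load).
    return random.random() > 0.3

def analyse(record: dict) -> dict:
    # Placeholder for the real work: real-time analysis, simulation, CEP, ...
    return {"id": record["id"], "score": record["value"] * 2}

def desktop_worker(name: str) -> None:
    while True:
        if not is_desktop_idle():
            time.sleep(1)              # desktop is busy; yield to the interactive user
            continue
        try:
            record = tasks.get(timeout=2)
        except queue.Empty:
            return                     # no more work for this desktop
        print(f"{name} processed {analyse(record)}")
        tasks.task_done()

# Fill the queue with sample "business events" and start a few desktop workers.
for i in range(10):
    tasks.put({"id": i, "value": i})

workers = [threading.Thread(target=desktop_worker, args=(f"desktop-{n}",)) for n in range(3)]
for w in workers:
    w.start()
for w in workers:
    w.join()
```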

Read Full Post »

“According to a recent survey conducted by Vanson Bourne on behalf of UC4 Software, almost half of the 300 IT managers surveyed are planning to develop a hybrid IT environment – a blend of physical, virtual and cloud infrastructure. And two-thirds of respondents consider managing workloads in a hybrid environment challenging or very challenging.”

Talking about automation today means talking about dynamically executed end-to-end automation. That’s why serious automation efforts nowadays must consider dependencies not just between standard applications, databases, platforms and legacy applications, but also between the virtual and the real world.
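
What “considering dependencies” can mean in practice is perhaps easiest to show with a small, hypothetical sketch: a single process whose steps span legacy, virtual and cloud systems and are only released once their predecessors have finished. The step names and targets are invented for illustration, not taken from any product.

```python
# Hypothetical sketch: one end-to-end process whose steps run in different worlds
# (legacy, virtual, cloud). A step is only released once all of its predecessors
# have completed; step names and targets are invented for illustration.

steps = {
    "extract_orders":  {"runs_on": "legacy ERP",    "depends_on": []},
    "load_warehouse":  {"runs_on": "virtual DB",    "depends_on": ["extract_orders"]},
    "score_customers": {"runs_on": "cloud service", "depends_on": ["load_warehouse"]},
    "send_report":     {"runs_on": "mail server",   "depends_on": ["load_warehouse", "score_customers"]},
}

def run(process: dict) -> None:
    done: set = set()
    while len(done) < len(process):
        runnable = [name for name, step in process.items()
                    if name not in done and all(dep in done for dep in step["depends_on"])]
        if not runnable:
            raise RuntimeError("circular or unsatisfiable dependency")
        for name in runnable:
            print(f"running {name} on {process[name]['runs_on']}")
            done.add(name)

run(steps)
```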

According to Ken Jackson, in an article on Data Center Knowledge, that’s where Intelligent Service Automation rings in the next phase in the evolution of workload automation: “According to the UC4 survey, eight in 10 respondents say automating between hybrid environments is important or very important. Increased confidence in automation between hybrid environments would increase the use of virtualization in 84 percent of cases and increase the use of cloud computing in 81 percent of the cases.”

Companies nowadays are grids of functions, people and relations focused on business processes – with dependencies wherever you look. That’s why you need a lot of intelligence to keep processes controllable and visible. Intelligence means management – which in turn is the key to bridging these worlds. This is not about night-shift jobs anymore. This is not about batch modalities. This is about the future of the cloud AND the future of automation.

Read Full Post »

People like to think in “either-or” solutions, trying to make their lives easier, and maybe unconsciously trusting the paradox that more choices may lead to a poorer decision (the Paradox of Choice). This is despite a reality that often follows a fuzzier, more compromising “both-and” logic.

Take the hype about cloud computing. Although the world is full of cloud apologists nowadays, one should bear in mind that the cloud market is still nascent: so far, only 4 percent of small and enterprise companies in the United States and Europe are taking advantage of it. From a management point of view this means that, in the near future, we will have to deal with the reality of hybrid environments and ever more processes connecting physical, virtual and/or cloud-based platforms.

Bob Muglia, president of Microsoft’s Server and Tools Business, proves that serious cloud providers like Microsoft share this view: “There’s more than one flavor of cloud computing, including private clouds that run on a business’s on-site servers. And it needn’t be an all-or-nothing proposition; we expect customers to want to integrate on-premises datacenters with an external cloud”.

One needs to keep this reality in mind when evaluating the new “Agent for Web Services” that UC4 unveiled a few weeks ago. In a conversation I had last week with Vincent Stueger, Chief Technology Officer of UC4 Software, he told me why this agent is “a really big and promising step. Because it’s much more than offering a simple web interface to remotely start a job or change a parameter. With this agent you can seamlessly monitor and control a process from the ground to the cloud and back.”
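
UC4 has not spelled out the agent’s interface in this conversation, so the following is only a generic sketch of the pattern Stueger describes – starting a job through a web-service call and then polling its status – with a completely made-up endpoint and payload, not the actual Agent for Web Services API.

```python
# Generic, hypothetical sketch of the pattern: start a job via a web-service call,
# then poll its status. The URL and payload fields are invented for illustration
# and are NOT UC4's actual Agent for Web Services interface.

import time
import requests

BASE_URL = "https://automation.example.com/api"    # made-up endpoint

def start_job(job_name: str, parameters: dict) -> str:
    response = requests.post(f"{BASE_URL}/jobs",
                             json={"name": job_name, "parameters": parameters},
                             timeout=10)
    response.raise_for_status()
    return response.json()["run_id"]               # assumed response field

def wait_for(run_id: str, poll_seconds: int = 5) -> str:
    while True:
        status = requests.get(f"{BASE_URL}/jobs/{run_id}", timeout=10).json()["status"]
        if status in ("SUCCEEDED", "FAILED"):
            return status
        time.sleep(poll_seconds)

run_id = start_job("nightly_billing", {"region": "EMEA"})
print("job finished with status:", wait_for(run_id))
```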

If Milind Govekar, Research Vice President, Gartner, is to be believed, this bridging capability will not just decide on the future of automation, but also on the future of the cloud: “The ability of the technology platform to manage a hybrid environment consisting of legacy and cloud applications with a single automation platform will be needed for clearing the way for greater adoption of cloud-based models.”

The cloud is not our destiny, but it offers us a big choice – if we are able to provide the bridges.

Read Full Post »

Don’t worry, I don’t want to open a new chapter in the “chicken or the egg” causality dilemma. But when I stumbled upon an argument by Bernard Golden – the author of the famous book Virtualization for Dummies – I was briefly reminded of a dead-end street called “Business/IT alignment” that we walked down some months ago.

Forget the wall separating IT and business. There is no such thing. It’s no longer true that business pre-exists and IT is just a representation of what is happening on the business side. Therefore any “paving the cow paths” approach to computing will fall short, as Golden emphasizes in his brilliant article:

“In the past, IT was used to automate repeatable business processes — taking something that already exists and computerizing it. The archetype for this kind of transformation is ERP — the automation of ordering, billing, and inventory tracking. That “paving the cow paths” approach to computing is changing. Today, businesses are delivering new services infused and made possible by IT — in other words, creating new offerings that could not exist without IT capabilities.”

What Golden describes here is the end of the reaction model of IT as we know it – someone acts and IT reacts. It’s not a pulling approach anymore; it’s a pushing approach – where location-sensitive devices and mashed-up applications interact with each other as part of data-driven processes.

Dynamization changes not only the application architecture, but also the requirements service-aware automation technologies must meet. In these highly variable environments, applications will – according to Golden – need to “dynamically scale”, “to gracefully and dynamically add new data streams as inputs” and “to rapidly shift context and data sets”.

Does this new variability sound familiar to you? No wonder – it’s the linchpin of UC4’s Intelligent Service Automation. Workloads need to be distributed dynamically, driven out of the business process itself, because events are the heartbeats of modern applications. That’s why workloads adhere neither to system borders nor to business hours. They just don’t care.
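
As a rough illustration of events being the heartbeats – and explicitly not UC4’s implementation – the following sketch routes each incoming application event to whichever platform (physical, virtual or cloud) currently has the most free capacity, ignoring system borders and business hours alike. The names and numbers are invented.

```python
# Rough illustration only: event-driven workload distribution across system borders.
# Each incoming application event is dispatched to whichever target environment
# currently has the most free capacity; the names and numbers are invented.

import random

targets = {
    "physical-cluster": {"capacity": 4,   "in_use": 0},
    "virtual-farm":     {"capacity": 8,   "in_use": 0},
    "public-cloud":     {"capacity": 100, "in_use": 0},   # effectively elastic
}

def pick_target() -> str:
    # Choose the environment with the most free slots, regardless of where it lives.
    return max(targets, key=lambda name: targets[name]["capacity"] - targets[name]["in_use"])

def on_event(event: dict) -> None:
    target = pick_target()
    targets[target]["in_use"] += 1
    print(f"event {event['type']} #{event['id']} -> workload dispatched to {target}")

# Simulate a stream of business events arriving at any hour of the day.
for i in range(6):
    on_event({"id": i, "type": random.choice(["order_received", "invoice_posted", "sensor_alarm"])})
```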

Read Full Post »

“Cloud computing is not just one more way to deploy information systems. It represents a total shift in how IT resources are delivered and ultimately will replace most if not all internally-maintained IT infrastructure.” This is how Frank Scavo starts his latest blog post on “the inexorable dominance of cloud computing” and the rise of utility computing – inspired by a speech by Nicholas Carr at a cloud computing conference organized last week in London by Google.

This is, by the way, the same Nicholas Carr who shook the world of many IT managers and CIOs when he published his article “IT Doesn’t Matter” in the Harvard Business Review in May 2003 – making the case that, from a strategic standpoint, infrastructural technologies would commoditize and become more and more invisible.

Nowadays, some years later, we’re past this initially provocative perspective. We are able to see that the value propositions of SaaS and cloud computing strategies are significantly better than those of on-premise software, and that both SaaS and the cloud rely on integrated technologies forming an automated backbone. The more invisible, the more mature. The more mature, the better.

It was Nicholas Carr who coined the matching Cloud koan in his blog: “Not everything will move into the cloud, but the cloud will move into everything.”

You can view his 30-minute talk here:

Read Full Post »

Sometimes industry hypes can reciprocally reinforce each other, and sometimes they even coexist so closely that the question “Who was first?” verges on the chicken-or-egg causality dilemma. With “Virtualization” and “Cloud Computing” it’s different. Of course, they do “hype” each other, but the concept of cloud computing is not even thinkable without having virtualization technologies in mind – a concept which is defined by the U.S. National Institute of Standards and Technology (NIST) as “a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources … that can be rapidly provisioned and released with minimal management effort or service provider interaction.”

I found this definition whilst going through a brand-new EMA White Paper titled “Achieving Virtualization Control with Intelligent Service Automation”. In this study, EMA researcher Andi Mann develops the argument that efficient use and dynamic provisioning of resources depend on service-aware automation technologies.

Workloads don’t care whether they run on physical or virtual systems. That’s why – according to Andi Mann – automated virtual service management is the basis of any serious cloud approach. “It sets the stage for well-managed cloud computing services, with two essential components. Firstly, the virtual infrastructure provides a turnkey approach to flexibility, agility, and scalability, and the essential convenient, on-demand configuration of shared computing resources. This is difficult, and perhaps impossible, with a tightly-coupled physical environment. Secondly, intelligent and sophisticated automation is essential to ensuring minimal management effort or service provider interaction, which would be impossible with a manual management approach.”

And that’s why the UC4 Intelligent Service Automation platform is an important building block for cloud computing:

  1. It initiates the dynamic distribution of workloads out of the business process.
  2. It immediately integrates newly provisioned systems into your daily recurring housekeeping routines for backup and maintenance.
  3. It introduces real-time intelligence to cloud computing by acting on, not merely reacting to, events inside the applications.
  4. And finally, it provides an end-to-end view for predictive process management, integrating real and virtual worlds.
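
Point 2 is perhaps the easiest to picture in code. The sketch below is a hypothetical provisioning hook – not the UC4 platform itself – that registers a newly provisioned system with the recurring backup and maintenance jobs the moment it comes online; every name in it is invented for illustration.

```python
# Hypothetical provisioning hook for point 2: the moment a new system comes online,
# it is registered with the recurring housekeeping jobs (backup, maintenance).
# The scheduler and job names are invented; this is not the UC4 platform itself.

from dataclasses import dataclass, field

@dataclass
class HousekeepingSchedule:
    jobs: dict = field(default_factory=lambda: {"backup": [], "maintenance": []})

    def register(self, hostname: str) -> None:
        # Add the new system to every recurring housekeeping job.
        for job, hosts in self.jobs.items():
            hosts.append(hostname)
            print(f"{hostname} added to the daily '{job}' job")

schedule = HousekeepingSchedule()

def on_system_provisioned(hostname: str, platform: str) -> None:
    # Called by the provisioning workflow once the VM or cloud instance is up.
    print(f"new {platform} system online: {hostname}")
    schedule.register(hostname)

on_system_provisioned("web-42", platform="virtual")
on_system_provisioned("analytics-7", platform="cloud")
```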

Read Full Post »
