
Archive for the ‘Process Automation’ Category

After five years as the top CIO priority, Business Intelligence has dropped to fifth place in Gartner’s 2010 Executive Programs survey. Reason enough for Gartner analyst Mark McDonald to take a closer look and find out what has happened behind the scenes. His conclusion is surprising and reassuring at the same time.

The good news is that BI is neither sick nor dead. The bad news for classical IT admins who still refuse to rack their brains over business-related topics is that BI as a technology is quietly being replaced by “Business Intelligence as a management capability” – which precisely reflects the shifting role of IT in general, and the fact that businesses today need more than just technology to create value in complex, business-driven environments.

“Creating a business intelligence capability demands a broader set of people, processes and tools that work together to raise intelligence, analytics and business performance … The move from technology to capability is about time as most organizations have already progressed through the peak of their CAPEX curve and moved from buying solutions to applying solutions.”

Nowadays it seems more appropriate to talk about the “intelligent business” than about “business intelligence”. Forget the cube specialists in their ivory towers. Business Intelligence has left pure theory behind, while intelligent businesses have already arrived at the front end, fostering seamless interaction between people, tools and processes.

That’s also why Josh Gingold and Scott Lowe from ZDNet can title their live webcast on April 20th “The New ROI: Return on Intelligence” – making the case that BI may have lost its position as a top technology priority in the Gartner survey, but is gaining importance for enterprise performance: “However, what many are now beginning to realize is that the return on an effective BI solution can actually exceed the cost of their entire IT budgets.”

UC4’s BI tools are tightly coupled to process performance. That’s our way of measuring this new kind of ROI.
We invite you to have a look!
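
To make “coupled to process performance” a little more tangible, here is a deliberately tiny, made-up example – the process name, run durations and the 30-minute SLA are all invented. It simply reports how many recent runs of a process finished within their SLA, the kind of figure a process-centric BI view is built on.

# Made-up example: given the durations of recent process runs, report how
# many finished within their SLA. The data and the 30-minute SLA are illustrative.
from datetime import timedelta

sla = timedelta(minutes=30)
runs = [  # (process name, actual duration)
    ("invoice-run", timedelta(minutes=22)),
    ("invoice-run", timedelta(minutes=35)),
    ("invoice-run", timedelta(minutes=28)),
    ("invoice-run", timedelta(minutes=31)),
]

on_time = sum(1 for _, duration in runs if duration <= sla)
print(f"SLA compliance: {on_time / len(runs):.0%}")  # -> SLA compliance: 50%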

Read Full Post »

Nowadays, enterprise IT is more and more often a mix of physical, virtual and cloud environments. A single point of management is therefore a prerequisite for meeting SLAs and ensuring that business processes crossing platform, application and even physical borders are completed on time.
The funny thing is: as long as we lack visibility, we keep thinking in terms of hurdles and obstacles. But the moment we can manage physical, virtual and even cloud resources and applications from a single pane of glass, we can outpace the disruptions and unify multiple jobs into one coherent process flow.

But even then we are not at the finish line. At the very moment we achieve this coherence, another effect appears – a boost in performance, built from end-to-end visibility, seamless workload distribution and unprecedented processing power. Looking more closely at these pillars of intelligent service automation, one might be reminded of another concept – a connection I came across thanks to Theo Priestley, an independent analyst and BPM visionary:

“Who remembers SETI@home, the project run by SETI to harness internet connected PC’s across the globe to help analyse signals from space? It was an early and successful attempt at mass distributed (or grid) computing using a small piece of software to use latent CPU cycles on client machines when the screensaver was engaged.

Now jump forward and the question is why hasn’t anyone taken this concept into the enterprise and into the BPM world itself? If you can imagine the many desktops that exist in an organisation sitting fairly idle when they could act as a BPM grid project to;

  • analyse, predict and act upon real-time data,
  • alter business rules on the fly,
  • creating intelligent workflow,
  • perform background simulation and CEP

Why bother with expensive server hardware (and future upgrades etc) when there’s potentially far more power sitting across the organisation not being fully utilised? Are there any examples of this in the BPM industry currently, if so would be good to hear about it.”

Yes Theo, there are examples – potential case studies are queuing up at our door. It seems to me that we have adapted this grid concept to the enterprise almost without noticing it. In any case, technologically we are ready.
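
Here is a deliberately simple sketch of the idea – not a product feature, just an illustration with invented names: a coordinator hands analysis tasks to whichever “desktop” currently reports itself as idle, the way a BPM grid might scavenge spare cycles across an organisation.

# Minimal sketch of a "BPM grid": a coordinator hands analysis tasks to
# whichever registered desktop reports itself as idle. Names, the fake idle
# check and the thresholds are illustrative, not any vendor's API.
import queue
import random
import threading
import time

task_queue = queue.Queue()
results = []

def desktop_worker(name: str) -> None:
    """Simulates an office desktop donating CPU cycles while it is idle."""
    while True:
        try:
            task = task_queue.get(timeout=1)
        except queue.Empty:
            return                      # no more work: back to being an ordinary desktop
        if random.random() > 0.8:       # stand-in for "the user is busy right now"
            task_queue.put(task)        # hand the task back to the grid
            time.sleep(0.1)
            continue
        time.sleep(0.05)                # the actual analysis would run here
        results.append((name, task))
        task_queue.task_done()

for event_id in range(20):              # e.g. real-time events to analyse
    task_queue.put(f"analyse-event-{event_id}")

workers = [threading.Thread(target=desktop_worker, args=(f"desktop-{i}",))
           for i in range(4)]
for w in workers:
    w.start()
for w in workers:
    w.join()

print(f"{len(results)} events analysed across {len(set(n for n, _ in results))} desktops")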

Read Full Post »

“According to a recent survey conducted by Vanson Bourne on behalf of UC4 Software, almost half of the 300 IT managers surveyed are planning to develop a hybrid IT environment – a blend of physical, virtual and cloud infrastructure. And two-thirds of respondents consider managing workloads in a hybrid environment challenging or very challenging.”

Talking about automation today means talking about dynamically executed end-to-end automation. That’s why serious automation efforts nowadays must consider dependencies not just between standard applications, databases, platforms and legacy applications, but also between the virtual and the real world.

According to Ken Jackson, in an article on Data Center Knowledge, that’s where Intelligent Service Automation rings in the next phase in the evolution of workload automation: “According to the UC4 survey, eight in 10 respondents say automating between hybrid environments is important or very important. Increased confidence in automation between hybrid environments would increase the use of virtualization in 84 percent of cases and increase the use of cloud computing in 81 percent of the cases.”

Companies today are grids of functions, people and relationships organised around business processes – with dependencies wherever you look. That’s why you need a great deal of intelligence to keep processes controllable and visible. Intelligence means management – which in turn is the key to bridging these worlds. This is not about night-shift jobs anymore. This is not about batch windows. This is about the future of the cloud AND the future of automation.
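
As a small sketch of what “dependencies wherever you look” means in practice – job names and platform tags are invented for the example – here is a minimal dependency-aware runner that starts a job only after everything it depends on has finished, regardless of whether that work ran on a physical, virtual or cloud platform.

# Illustrative only: a tiny dependency-aware runner for jobs that span
# physical, virtual and cloud platforms. Job names and platform tags are
# made up; real workload automation models far more than this.
from graphlib import TopologicalSorter  # Python 3.9+

# job -> (platform it runs on, jobs it depends on)
jobs = {
    "extract-orders":  ("physical", set()),
    "load-warehouse":  ("virtual",  {"extract-orders"}),
    "score-customers": ("cloud",    {"load-warehouse"}),
    "send-report":     ("physical", {"score-customers", "load-warehouse"}),
}

def run(job: str, platform: str) -> None:
    # A real agent would submit the job to the target platform here.
    print(f"running {job!r} on the {platform} platform")

order = TopologicalSorter({name: deps for name, (_, deps) in jobs.items()})
for job in order.static_order():        # respects every dependency
    run(job, jobs[job][0])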

Read Full Post »

People like to think in “either-or” solutions, trying to make their lives easier, and perhaps unconsciously trusting the paradox that more choices can lead to poorer decisions (the Paradox of Choice). This is despite a reality that often follows a fuzzier, more compromising “both-and” logic.

Take the hype about cloud computing. Although the world is full of cloud apologists nowadays, one should bear in mind that the cloud market is still nascent: so far, only 4 percent of small and enterprise companies in the United States and Europe are taking advantage of it. From a management point of view, this means that for the near future we will have to deal with the reality of hybrid environments and ever more processes connecting physical, virtual and/or cloud-based platforms.

Bob Muglia, president of Microsoft’s Server and Tools Business, proves that serious cloud providers like Microsoft share this view: “There’s more than one flavor of cloud computing, including private clouds that run on a business’s on-site servers. And it needn’t be an all-or-nothing proposition; we expect customers to want to integrate on-premises datacenters with an external cloud”.

This is the reality one needs to keep in mind when evaluating the new “Agent for Web Services” UC4 unveiled a few weeks ago. In a conversation last week, Vincent Stueger, Chief Technology Officer of UC4 Software, told me why this agent is “a really big and promising step. Because it’s much more than offering a simple web interface to remotely start a job or change a parameter. With this agent you can seamlessly monitor and control a process from the ground to the cloud and back.”
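
To make the idea concrete without pretending to document UC4’s actual interface, here is a purely hypothetical sketch – the endpoint, payload shape and job name are invented – of how a scheduler could use a web-service agent to start and check a job on a remote system, whether that system sits on the ground or in the cloud.

# Hypothetical illustration only: the endpoint, payload and job name below are
# invented and are NOT UC4's actual web-service API. The point is simply that
# a web-service agent lets a scheduler start and monitor work on a remote
# (cloud or on-premises) system over plain HTTP.
import json
import urllib.request

def start_remote_job(base_url: str, job_name: str, parameters: dict) -> dict:
    """Ask a fictional web-service endpoint to start a job and return its status."""
    body = json.dumps({"job": job_name, "parameters": parameters}).encode("utf-8")
    request = urllib.request.Request(
        url=f"{base_url}/jobs",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

if __name__ == "__main__":
    # Example call against a made-up endpoint; a real integration would also
    # poll the job's status until it completes, keeping ground and cloud in sync.
    status = start_remote_job("https://cloud.example.com/api", "nightly-billing",
                              {"region": "eu"})
    print(status)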

If Milind Govekar, Research Vice President at Gartner, is to be believed, this bridging capability will decide not just the future of automation, but also the future of the cloud: “The ability of the technology platform to manage a hybrid environment consisting of legacy and cloud applications with a single automation platform will be needed for clearing the way for greater adoption of cloud-based models.”

The cloud is not our destiny, but it does offer a real choice – provided we are able to build the bridges.

Read Full Post »

Don’t worry, I don’t want to open a new chapter in the “chicken or the egg” causality dilemma. But when I stumbled upon an argument by Bernard Golden – the author of the well-known book Virtualization for Dummies – I was briefly reminded of a dead-end street called “Business/IT alignment” that we walked down a few months ago.

Forget the wall separating IT and business. There is no such thing. It is no longer true that the business pre-exists and IT is merely a representation of what happens on the business side. That’s why any “paving the cow paths” approach to computing will fall short, as Golden emphasizes in his brilliant article:

“In the past, IT was used to automate repeatable business processes — taking something that already exists and computerizing it. The archetype for this kind of transformation is ERP — the automation of ordering, billing, and inventory tracking. That “paving the cow paths” approach to computing is changing. Today, businesses are delivering new services infused and made possible by IT — in other words, creating new offerings that could not exist without IT capabilities.”

What Golden describes here is the end of the reactive model of IT as we know it – someone acts and IT reacts. It is no longer a pull approach but a push approach, in which location-sensitive devices and mashed-up applications interact with each other as part of data-driven processes.

Dynamization changes not only the application architecture, but also the requirements that service-aware automation technologies must meet. In these highly variable environments, applications will – according to Golden – need to “dynamically scale”, “to gracefully and dynamically add new data streams as inputs” and “to rapidly shift context and data sets”.

Does this new variability sound familiar to you? No wonder – it is the linchpin of UC4’s Intelligent Service Automation. Workloads need to be distributed dynamically, out of the running process itself, because events are the heartbeats of modern applications. That’s why workloads adhere neither to system borders nor to business hours. They just don’t care.
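
A minimal sketch of that event-driven pattern, with invented event names: instead of waiting for a batch window, a listener dispatches a workload the moment its triggering event arrives.

# Minimal sketch of event-driven workload dispatch, as opposed to a fixed
# nightly schedule: work starts the moment its triggering event arrives.
# Event names and the dispatch logic are invented for the illustration.
import queue
import threading
import time

events: "queue.Queue[str]" = queue.Queue()

def dispatch(event: str) -> None:
    # A real automation engine would pick the right workflow for the event
    # and route it to whatever platform currently has capacity.
    print(f"{time.strftime('%X')} event {event!r} received -> workload started")

def listener() -> None:
    while True:
        event = events.get()
        if event == "shutdown":         # sentinel to end the example
            break
        dispatch(event)

worker = threading.Thread(target=listener)
worker.start()

# Events arrive whenever they arrive - not at a batch window.
for name in ("new-order", "sensor-alert", "file-arrived"):
    events.put(name)
    time.sleep(0.2)
events.put("shutdown")
worker.join()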

Read Full Post »

It was Thomas Samuel Kuhn (1922-1996), one of the most influential philosophers of science of the twentieth century, who revolutionized our picture of the structure of scientific revolutions by claiming that, according to his paradigm concept, a mature science experiences alternating phases of normal science and revolution: “In normal science the key theories, instruments, values and metaphysical assumptions that comprise the disciplinary matrix are kept fixed, permitting the cumulative generation of puzzle-solutions, whereas in a scientific revolution the disciplinary matrix undergoes revision, in order to permit the solution of the more serious anomalous puzzles that disturbed the preceding period of normal science.”

What this quotation from the Stanford Encyclopedia of Philosophy conceals is that in normal science the questions, too, are more or less fixed and prefabricated, and that during these periods scientists mostly raise questions to which they already know the answers.

Our situation is not normal at all. It is a situation of fundamental change – and not just because of the crisis. Mark McDonald from Gartner knows this, and he knows the importance of asking the right questions – questions that can move us forward: “Great CIOs ask good questions pretty much all the time. A good question is one that creates knowledge and shares understanding. A good question makes both parties smarter. Most questions are not great questions. Helpful yes, but they simply exchange information from one side to the other.”

I don’t want to withhold from you the following rough typology of great questions that Mark McDonald gives …

  • Logic-checking questions – if that is true, then these other things must be false?
  • Implications-based questions – so given this issue, are we also seeing these other things happening?
  • Proof-of-fact questions – so how do you know the issue is happening, and what are the consequences?
  • Forward-looking questions – so given all of that, what next steps do you suggest we take?

… also because most process optimization efforts follow the same steps.

By the way, Mark McDonald recently did a whole series of posts about what makes a great CIO.

Read Full Post »

“Over the next five years, IT automation will overtake offshoring as the next major efficiency trend in IT.” This is how Ken Jackson, President of the Americas at UC4, opens his article “The Dawning of the IT Automation Era”. That is surprising only to those who either consider offshoring the universal answer to all cost-reduction challenges or think that cost reduction is the only goal of IT automation.

But in a world where IT environments are becoming ever more complex, “squeezing every bit of extra cost out of your IT budget” and thereby leaving IT professionals with a “bare bones operating plan” is simply not a sustainable tactic. It’s like engaging in a rulebook slowdown while neglecting the fact that IT can really boost your business and ensure accurate service delivery.

The truth behind this is simple: you need money to invest in cost-saving technologies, because merely keeping IT systems up and running is not enough. If you don’t want to throw the baby out with the bathwater, you have to develop business-innovation capabilities and cost-avoidance strategies in tandem.

The answer to complexity is process visibility, combined with real-time intelligence and just-in-time execution. That will help you squeeze every bit of value – rather than cost – out of your IT budget.

And this is what it’s all about.

For more information on the “several factors contributing to the coming age of IT automation”, read Ken Jackson’s inspiring article.

Read Full Post »

Older Posts »