
Archive for the ‘Just-In-Time’ Category

When talking about automation, people easily ignore the power of change and treat the contemplated processes as if they were engraved in stone – despite the fact that “change is not new and change is natural”, as Thomas L. Friedman points out in his thought-provoking book The World is Flat: “Change is hard. Change is hardest on those caught by surprise. Change is hardest on those who have difficulty changing too.”

Talking about change means talking about events – the secret currency of change, registering every single change of state. This is worth emphasizing because events are not only the drivers of today’s businesses and operations; they can occur everywhere – crossing platform, departmental and even enterprise borders.

Today you’re managing dynamic IT environments which are complex blends of physical, virtual and cloud-based resources. In such environments, transparency is key to staying agile and responsive. But merely being reactive is not enough to keep your business situationally aware. To ensure that the processes are up to date and the engine is not automating errors and detours, any automation effort must be accompanied by an ongoing optimization effort.

The crux is that reaction and analysis mesh. Take the lunch break at school as a real-world example: the bell rings, and 10 seconds later everyone is standing in line at the cafeteria waiting to be served. Following the classical monitoring approach, cooking would start when the bell rings. Knowing more about the processes in the kitchen, the guys from UC4 start cooking two hours earlier – so everything is ready when the children arrive.

This kind of processing intelligence is key to avoiding overhead and running automated environments in a cost- and SLA-conscious way. Knowing the processes in the school, the ringing bell is a foreseeable event. So you had better not waste time and money on reducing the reaction time. Instead, it makes a lot of sense to monitor the cooking process as close to real time as possible. That ensures you have all the processing options available – before the bell rings!
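
To make the cafeteria example concrete, here is a minimal sketch in Python of the difference between reacting to a foreseeable event and scheduling ahead of it. The date, names and two-hour figure are purely illustrative and not taken from any UC4 product:

```python
from datetime import datetime, timedelta

BELL_TIME = datetime(2009, 11, 16, 12, 0)  # the foreseeable event
COOKING_DURATION = timedelta(hours=2)      # known from the kitchen's process data

def reactive_start() -> datetime:
    """Classical monitoring: act only when the event fires."""
    return BELL_TIME  # cooking starts as the bell rings -- lunch will be late

def predictive_start() -> datetime:
    """Processing intelligence: schedule the work so it finishes just in time."""
    return BELL_TIME - COOKING_DURATION  # cooking starts two hours early

print(reactive_start())    # 2009-11-16 12:00:00
print(predictive_start())  # 2009-11-16 10:00:00 -- ready when the bell rings
```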

Knowing that change is a constant, not a variable, and that automation can only be effective if it is combined with intelligence, UC4’s Application Assurance solution incorporates real-time data, insight into complete end-to-end business and IT processes, and intelligent decision making.

Have a look. It’s worth it!

Read Full Post »

The Gartner Symposium/ITxpo 2009 we attended in Orlando not only endorsed the big hype around virtualization and cloud computing, but also our ongoing investments in service-aware process automation – offering real-time intelligence for just-in-time execution. It fit perfectly that Gartner analyst Roy Schulte and K. Mani Chandy, Professor at the California Institute of Technology in Pasadena, used this event to introduce their brand-new book, “Event Processing: Designing IT Systems for Agile Companies”, about the business drivers, costs and benefits of event-processing applications.

According to Mr. Schulte and Mr. Chandy, the new aspirations in situation awareness and reaction accuracy can’t be achieved by simply speeding up traditional business processes or exhorting people to work harder and smarter with conventional applications. Instead they urge companies to make fundamental changes in the architecture of business processes and the application systems that support them by making more use of the event-processing discipline. “While a typical business process has time-driven, request-driven and event-driven aspects, event-driven architecture (EDA) is underutilized in system design resulting in slow and inflexible systems,” said Mr. Chandy. “Event-driven systems are intrinsically smart because they are context-aware and run when they detect changes in the business world rather than occurring on a simple schedule or requiring someone to tell them when to run.”
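
As a toy illustration of the event-driven style Mr. Chandy describes – a handler that carries its context in the event and runs when a change of state is detected, not on a timer and not on request – consider this Python sketch of ours; every event name and field in it is invented:

```python
import queue

events = queue.Queue()  # stands in for whatever transport delivers state changes

def on_event(event: dict) -> None:
    """Event-driven: runs when a change is detected, not on a schedule."""
    if event["type"] == "inventory_low":
        print(f"Reordering {event['sku']} -- triggered by the change itself")

# Somewhere, a producer detects a business change and publishes it:
events.put({"type": "inventory_low", "sku": "A-4711"})

# The consumer reacts as soon as the event is available:
while not events.empty():
    on_event(events.get())
```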

“Event-driven CEP is a kind of near-real-time business intelligence (BI), a way of ‘connecting the dots’ to detect threats and opportunities,” explained Mr. Schulte. “By contrast, conventional BI is time-driven or request-driven. Complex events may be reactive, summarizing past events, or predictive, identifying things that are likely to happen based on what has happened recently compared with historical patterns.”

Nothing to add. UC4 can deliver!

Read Full Post »

When people talk about automating processes, quite an important aspect is often ignored, namely change – in spite of the reality that today’s IT environments, exactly like the markets they are made for, are continually changing, and processes are dynamically initiated and implemented.

At any given opportunity we emphasize that change is the only constant, but do we really appreciate the consequences? UC4 certainly does. This is proven and confirmed by today’s acquisition of SENACTIVE and the consolidation of a longstanding partnership. We talked to SENACTIVE’s CTO Josef Schiefer.

Mr. Schiefer, how do you deal with the reality that even automated processes have errors and do not always run smoothly?

Josef Schiefer, CTO Senactive

JSchiefer: We respond with our graphical analysis tools, which are absolutely key to error detection and optimization strategies. All the data accumulated or provided by processes has to be continually visualized, analyzed and updated – on the one hand because client behaviour patterns and requirements change, on the other hand because, these days, dealing with exceptions and specifics is of the utmost importance.

What does the successful optimization of a process depend on?

JSchiefer: The current-state analysis of the process workflow is critical. It examines which resources are used by the process, which operations belong to the process and how these are related to each other. To get a holistic view of the process, this analysis should ideally be performed graphically. After this, all parts of the process are systematically tested for existing weaknesses. The identified weaknesses give the vital impetus for optimization.
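
As a rough sketch of what such a current-state analysis might look like in code (the process model, durations and threshold below are invented for illustration and are not SENACTIVE’s actual tooling):

```python
# Each operation: name -> (average duration in seconds, downstream operations)
process = {
    "extract":   (120, ["transform"]),
    "transform": (900, ["load"]),  # a suspiciously slow step
    "load":      (60,  []),
}

SLOW_THRESHOLD = 300  # seconds; the weakness criterion, chosen per process

def find_weaknesses(proc: dict) -> list[str]:
    """Systematically test every part of the process for weaknesses."""
    return [op for op, (duration, _) in proc.items() if duration > SLOW_THRESHOLD]

print(find_weaknesses(process))  # ['transform'] -- the impetus for optimization
```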

UC4 emphasizes the budget-friendly concept of in-time processing: “to deliver the right information at the right time.” You have a similar approach. Could you explain, from your perspective, the relationship between in-time and real-time?

JSchiefer: Here it is important to keep two things apart. If we are talking about monitoring processes, that needs to happen in “real time” as far as possible. “In-time”, on the other hand, means that the timing requirements for the process operation come from the client, because it is the client who decides whether the answer was prompt or not. Accordingly, a process can be handled in batch mode and the result can still reach the client on time. In such a case it would be insane – or rather, the costs would be totally over budget – if the process were handled in real-time mode.

But the question then is: who decides each time how a process will be handled …

JSchiefer: That is exactly the point. This decision about the mode of processing cannot be made at just any time; it has to be made promptly, preferably in real time, so that I have all the processing options available and can make cost-conscious decisions. Because one thing is certain: true real-time handling only makes sense for selected processes – those that are core to the business, or rather, that cause problems for the client when they are delayed. To get to the point: you have to observe the processes in real time in order to implement the right process steps at the right moment, or in-time.
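
In code, such a cost-conscious mode decision could look like the following sketch. It is our simplification, not SENACTIVE’s implementation; the deadlines and latencies are made up:

```python
from datetime import datetime, timedelta

def choose_mode(deadline: datetime, batch_latency: timedelta, now: datetime) -> str:
    """Pick real-time handling only where batch would miss the client's deadline."""
    if now + batch_latency <= deadline:
        return "batch"      # cheaper, and the client still gets the result in time
    return "real-time"      # the process is time-critical for this client

now = datetime.now()
# A nightly report due tomorrow morning: batch is perfectly in-time.
print(choose_mode(now + timedelta(hours=10), timedelta(hours=4), now))   # batch
# A fraud check due within seconds: only real-time handling makes sense.
print(choose_mode(now + timedelta(seconds=5), timedelta(hours=4), now))  # real-time
```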

Ok, so: Monitoring as close to real-time as possible. Processing when necessary.

JSchiefer: Precisely. But that is just one aspect – the time-critical side of the analysis, where quick and precise decisions are imperative: that’s called UC4 Decision (previously SENACTIVE In-Time). The other side is optimization, as we have already mentioned: that is UC4 Insight (previously SENACTIVE Event Analyzer). You can compare UC4 Insight with a data-mining tool with which you can pull out specific processes and put them under the microscope. That is the detective part of the analysis work, in which intuitive visualization techniques help with the recognition of patterns and the identification of delays. This visualization of event flows allows you to explore and visually experience complex relationships and the causes of exceptions.

You can virtually “fly through” the event flow and spot idle-running potential visually. Crucially, because the analysis is based not on the process database but on a dedicated event database, it can run continually without overloading the productive system. By the way, this is not about single events but about the correlation of events, through which modern business processes are described.
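
To give a feel for what “correlation of events” means in practice, here is a small Python sketch of grouping an event flow by process instance and measuring the gaps between steps. The event shapes are invented; this is not the UC4 Insight data model:

```python
# An event store kept separate from the productive process database.
event_store = [
    {"order": 17, "step": "received", "t": 0},
    {"order": 17, "step": "picked",   "t": 40},
    {"order": 18, "step": "received", "t": 50},
    {"order": 17, "step": "shipped",  "t": 7200},  # a long gap worth exploring
]

def correlate(events: list[dict]) -> dict[int, list[dict]]:
    """Group the event flow by business process instance (here: order id)."""
    flows: dict[int, list[dict]] = {}
    for e in events:
        flows.setdefault(e["order"], []).append(e)
    return flows

for order, flow in correlate(event_store).items():
    gaps = [b["t"] - a["t"] for a, b in zip(flow, flow[1:])]
    print(order, gaps)  # order 17 shows a 7160-second gap -- a delay to investigate
```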

To read the interview in full length please go to http://senactive.uc4.com

Read Full Post »

As long as process execution runs smoothly, nobody cares what happens in the event tunnel. But the moment an accident happens, a narrow and murky tunnel becomes a big threat to people and businesses. Providing analysis tools for subsequent insights and predictive simulations brings light into this tunnel, of course, but the moment traffic resumes, the tunnel will be dark again.

With hundreds of thousands, maybe millions of events a day, this picture is really threatening. That’s why the connection between the automation engine and the user is so essential nowadays, as we pointed out some weeks ago. On the other hand, ‘complex event processing’ (CEP) is “still a term that scares people away”, as Joe McKendrick points out in a ZDNet article, searching for “a softer way to describe” what this thing is about.

But what if CEP – or EP to soothe your nerves – is no less than key to the situational awareness of your business? A concept which is summarized by David E. Olson in a very interesting Event Processing Roundtable at ebizQ: “When you talk about situational awareness, you have three aspects of that continuum. We’ve got the past, the future and the present. What CEP does is add a significant amount of intelligence in the present, so that the business can act in the moment, and improve decision making in the future …”

It’s all about making your enterprise smart and agile. Complex event processing is designed to handle, normalize, aggregate and analyze a wide range of information in real time. You think that’s impossible considering hundreds of thousands of events per day? Good point! But that’s part of the homework any event-processing effort has to do: to discern whether a thing that happened is notable. Setting up this filter is a bit like searching for a needle in a haystack. But at the end of the day you will find that there are just a handful of events remaining that actually impact your business.
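
What such a “notability” filter might look like, reduced to a few lines of Python (the event types and thresholds are invented for illustration):

```python
raw_events = [
    {"type": "heartbeat", "host": "app01"},
    {"type": "job_failed", "job": "PAYROLL", "retries": 3},
    {"type": "heartbeat", "host": "app02"},
    {"type": "latency", "service": "checkout", "ms": 95},
    {"type": "latency", "service": "checkout", "ms": 4200},
]

def is_notable(event: dict) -> bool:
    """Discern whether a thing that happened is worth acting on."""
    if event["type"] == "job_failed" and event.get("retries", 0) >= 3:
        return True
    if event["type"] == "latency" and event.get("ms", 0) > 1000:
        return True
    return False  # heartbeats and normal readings drop out of the haystack

print([e for e in raw_events if is_notable(e)])
# Only the failed payroll job and the 4.2-second checkout latency remain.
```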

Read Full Post »

Even if the concept of a service-oriented architecture (SOA) looks revolutionary compared with the antique appearance of a batch-mode culture, it does not draw a line under background processing. The remaining dependency is illustrated by the fact that over 50% of all applications still perform their processing in the background.

This is possible thanks to workload automation tools that bridge between SOA and legacy applications and ensure that the user/client gets the needed information in time. Of course, batch is not real time. But the automation engine that supports initiating SOA processing in the background establishes the required connection to process jobs under the SOA umbrella.
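
A schematic of that bridge, sketched in Python with invented names (no real workload automation product works exactly like this): the service call returns immediately, while the work itself runs as a background batch job:

```python
import queue

job_queue = queue.Queue()  # hand-off between the SOA layer and the batch engine

def service_request(report_id: str, needed_by: str) -> str:
    """SOA-facing entry point: initiate background processing, don't block."""
    job_queue.put({"report": report_id, "deadline": needed_by})
    return f"accepted: {report_id} will be ready by {needed_by}"

def run_batch() -> None:
    """Background engine: works through the queue ahead of the deadlines."""
    while not job_queue.empty():
        job = job_queue.get()
        print(f"processing {job['report']} for delivery by {job['deadline']}")

print(service_request("Q3-sales", "08:00"))  # the client gets an answer at once
run_batch()                                  # the actual work happens in batch
```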

You mean, “in-time” is a woolly concept? No way! Look at the user/client. He defines it – and not just in this cartoon.

You will find the details about how “Workload Automation works with SOA” in this Whitepaper. For more cartoons go to www.CartoonStock.com where we also downloaded and licensed this one.

Read Full Post »

In last week’s New Yorker feature, Malcolm Gladwell retells the story of David and Goliath, finding ways to raise David’s winning percentage to a remarkable 63.6%. The subtext is a portrait of Vivek Ranadivé, CEO and founder of TIBCO, and the move from batch to real time as a sort of holy mission.

“We’ve been working with some airlines,” he said. “You know, when you get on a plane and your bag doesn’t, they actually know right away that it’s not there. But no one tells you, and a big part of that is that they don’t have all their information in one place. There are passenger systems that know where the passenger is. There are aircraft and maintenance systems that track where the plane is and what kind of shape it’s in. Then, there are baggage systems and ticketing systems—and they’re all separate. So you land, you wait at the baggage terminal, and it doesn’t show up.”

Everything bad that happens in that scenario, Ranadivé maintains, happens because of the lag between the event (the luggage doesn’t make it onto the plane) and the response (the airline tells you that your luggage didn’t make the plane). The lag is why you’re angry. The lag is why you had to wait, fruitlessly, at baggage claim. The lag is why you vow never to fly that airline again.

Indeed, this example is a good one. But on taking a closer look, you will find that it doesn’t necessarily illustrate the advantages of real-time processing. What we are looking at is a situation in which people who require certain information don’t have to wait for it. But not having to wait doesn’t necessarily mean “real-time”. It only means that the information arrives “in-time” – at the moment the person needs it.

The fact that “just-in-time” can be real-time in certain cases, but doesn’t have to be in many others, is important – especially for all the Davids out there who cannot afford to over-provision their IT infrastructure. They have to examine each process inside their organization to identify “the point in time” at which the business needs a particular piece of information.

And in doing so, they will find that the business requirements are quite different, even within a single vertical. A high-end clothing retailer does not require point-of-sale information to be uploaded and processed until after the stores have closed, so it doesn’t need constant refreshing of information during the day. A food retailer, on the other hand, may need to look at its point-of-sale information every hour, because merchandising and the delivery of goods happen on an hourly or half-day basis.
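
Expressed as a scheduling table, the contrast might look like this. The retailer names and intervals below are illustrative, not customer data:

```python
# The same point-of-sale feed, scheduled differently per business need.
pos_upload_schedules = {
    "high_end_clothing": {"when": "after_store_close", "runs_per_day": 1},
    "food_retail":       {"when": "hourly",            "runs_per_day": 12},
}

def runs_needed(vertical: str) -> int:
    """More frequent processing costs more; schedule only what the business needs."""
    return pos_upload_schedules[vertical]["runs_per_day"]

print(runs_needed("high_end_clothing"))  # 1  -- an overnight batch is in-time
print(runs_needed("food_retail"))        # 12 -- hourly merchandising decisions
```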

Examining means validating. And validating is the precondition for automating processes at the most effective cost point. There is a cost inherent in the process response time. And there is a value inherent in every piece of information that arrives at the right time, where people need it.

Read Full Post »

There was a time when everybody talked about ‘real-time’. You couldn’t speak highly of something without stressing that it was almost, or close to, or already ‘real-time’. Do you remember the 70s in the car industry? I do. And I remember the similarities as well. It was a time when no kid could help looking at the speedometer when passing a car. It wasn’t just my childhood, but also the childhood of the car industry – a time of immaturity, when technology was praising itself. Back to my starting point: looking at the speedometer is a bit like talking about ‘real-time’. What comes off badly is the user (the driver).

Because the mature driver doesn’t care about 280 miles per hour on the speedometer as long as he is travelling on roads with a speed limit of 80 or 100 miles per hour. But he does care about safety, travelling comfort, fuel consumption and so on. He is like the business user, who doesn’t care about real-time. What he wants is ‘just-in-time’: to get the information he needs at the time he needs it. No more, no less.

That’s also your company’s challenge when it comes to automating across different people, platforms, systems and applications. You don’t want to over- or under-provision your IT infrastructure, as there is a cost to doing so. What you want is to effectively deliver the right business information at the right time at the lowest possible cost. Why deliver in real time when there is no need and nobody cares?

Take Amazon or other online selling companies: A consumer ordering a product finds it more than acceptable to receive an email order confirmation a few minutes after having placed their order. The consumer will get no added value if the email arrives within 1 millisecond, but will certainly be disappointed if the email arrives 24 hours later. Hence online buying and selling is a classic example of using ‘Just-In-Time’ processing to satisfy consumer needs and keep automation costs to a minimum.
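
The consumer’s indifference inside the acceptable window can be written down directly. A minimal sketch, assuming a 24-hour window as in the example above:

```python
from datetime import timedelta

TOO_LATE = timedelta(hours=24)  # outside this window the information loses its value

def consumer_reaction(delay: timedelta) -> str:
    if delay > TOO_LATE:
        return "disappointed"  # the information missed its moment
    return "satisfied"         # a millisecond adds no value over a few minutes

print(consumer_reaction(timedelta(milliseconds=1)))  # satisfied
print(consumer_reaction(timedelta(minutes=3)))       # satisfied -- same value
print(consumer_reaction(timedelta(hours=25)))        # disappointed
```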

In fact, there is a parallel to the manufacturing industry with the familiar technique of ‘Just-in-time’ manufacturing that revolutionized this industry. Why shouldn’t the same approach be able to revolutionize a business’ automation strategy – receiving maximum value from automating its processes at the most effective cost point?

Read Full Post »