Paul Barsch (from Teradata’s Marketing Department) recently posted a blog on the acceleration of decision-making speeds in business, which is likely a concern to all software companies that have grown up around, and are tied to, traditional data-driven business intelligence and reporting. Paul specifically references an MIT Technology Review article and the “science of event processing” allowing responses in milliseconds – in other words, extremely low-latency decisions.
The challenge with (1) speedy decisions is (2) ensuring accuracy – including regulatory compliance, alignment with business strategy, and so on – while achieving (3) low cost per decision and decision type, taking both development and deployment into account. Combining all three is tricky: you need event processing for speed; capabilities like decision management, inference rules, visualization tools, and analytics for accuracy; and powerful, integrated software development techniques to lower costs. In the past, IT teams have had to get by with reduced capabilities here – for example, never really achieving (1) because they relied on a traditional database, with its inherent file-access transaction times, fronted by a traditional application server doing simple sequential processing of events (a simple sketch of the event-driven alternative follows the list below). Nonetheless, it is this combination of speed, accuracy, and cost that is driving the development of Complex Event Processing technologies such as:
- event-driven rule-based distributed event processing – typified by products like TIBCO BusinessEvents
- distributed data sources for high performance data access – typified by developments such as TIBCO ActiveSpaces
- high performance middleware solutions – a long-term TIBCO speciality with TIBCO EMS and Rendezvous.
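To make the contrast with sequential, database-bound processing a little more concrete, here is a minimal, hypothetical sketch (not TIBCO BusinessEvents code, and not any particular product’s API) of event-driven, rule-based processing: rules are held in memory and evaluated as each event arrives, so the decision latency is not tied to database round trips or batch reporting cycles. All class names, rule names, and thresholds are invented for illustration.

```java
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.function.Consumer;
import java.util.function.Predicate;

// Hypothetical sketch: rules fire as each event arrives, rather than
// a scheduled job querying a database and processing results sequentially.
public class EventRuleSketch {

    // A simple event: a type plus a value (e.g. a payment amount).
    record Event(String type, double value) {}

    // A rule pairs a condition with an action; both run in memory,
    // so each decision avoids a database round trip.
    record Rule(String name, Predicate<Event> condition, Consumer<Event> action) {}

    public static void main(String[] args) throws InterruptedException {
        // Illustrative rules only; names and thresholds are made up.
        List<Rule> rules = List.of(
            new Rule("large-payment",
                     e -> e.type().equals("payment") && e.value() > 10_000,
                     e -> System.out.println("Flag for review: " + e)),
            new Rule("any-trade",
                     e -> e.type().equals("trade"),
                     e -> System.out.println("Route to trade handler: " + e))
        );

        BlockingQueue<Event> inbox = new LinkedBlockingQueue<>();

        // Consumer thread: decisions are made as events stream in,
        // not after a batch report is produced.
        Thread processor = new Thread(() -> {
            try {
                while (true) {
                    Event e = inbox.take();
                    for (Rule r : rules) {
                        if (r.condition().test(e)) {
                            r.action().accept(e);
                        }
                    }
                }
            } catch (InterruptedException ignored) {
                // exit quietly when the demo ends
            }
        });
        processor.setDaemon(true);
        processor.start();

        // Producer: simulate an incoming event stream.
        inbox.put(new Event("payment", 25_000));
        inbox.put(new Event("trade", 500));
        Thread.sleep(200); // give the processor time to drain the queue
    }
}
```

In a production CEP deployment the in-memory rule set and event queue would of course be replaced by a distributed rule engine, shared data grid, and messaging backbone – which is exactly the role the product categories above play.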
Notes: see also JT’s ebizQ comments on this blog posting.