Just reading Sandy Kemsley’s reporting on the Gartner BPM conference and in particular Jim Sinur on “dynamic BPM”. From the TIBCO perspective we definitely see “CEP” overlap with “BPM” here – mostly around business automation rather than workflow / general process modeling, although of course workflow events can be complex events too.
Sandy comments: A significant part of [dynamic BPM] is the inclusion of explicit rules within processes, so that scenario-driven rule sets can detect and respond to conditions, even without the process participants having to make those changes themselves… What used to be monolithic lumps of code can be split into several parts, each of which has the potential to be agile.
Reflecting on this, based on experience (i.e. customer production use cases with TIBCO BusinessEvents):
- “Rules within processes” can be achieved through “processes defined as rules” (as well as the more usual practice of forcing the business analyst to separate process and rules via different tooling). Of course, “rules” (e.g. business decisions, event and process rules, etc.) can themselves be represented in multiple ways: we find the UML State Model (for entity lifecycle modeling) very useful for defining processes and as a (state transition) rule representation.
- “Monolithic lumps of code” could be interpreted as “monolithic process diagrams” in a BPM context. The real benefit comes from the model-driven approach: combining multiple models to provide agility and flexibility at whatever level (business or IT) is required. The important thing here is that “one model to rule them all” (e.g. BPMN) doesn’t work… yet.
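To make the “process defined as state-transition rules” idea concrete, here is a minimal sketch (hypothetical names and structure — this is not TIBCO BusinessEvents syntax, just an illustration of the pattern): an order entity’s lifecycle expressed as a table of (current state, event) → next state rules, so the process definition and the rule set are the same artifact.

```python
# Hypothetical illustration: an entity lifecycle as state-transition rules.
# Each rule maps a (current_state, event) pair to the next state, so editing
# the table changes both "the process" and "the rules" in one place.

TRANSITIONS = {
    ("new", "payment_received"): "paid",
    ("paid", "items_shipped"): "shipped",
    ("shipped", "delivery_confirmed"): "closed",
    ("new", "cancel_requested"): "cancelled",
    ("paid", "cancel_requested"): "refunding",
}

class Order:
    """Entity whose lifecycle is driven entirely by the transition table."""

    def __init__(self) -> None:
        self.state = "new"

    def on_event(self, event: str) -> str:
        # Apply a matching transition rule; unknown events leave the state alone.
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state

order = Order()
order.on_event("payment_received")  # -> "paid"
order.on_event("items_shipped")     # -> "shipped"
```

Note the agility point from above: adding a new lifecycle path is a one-line rule change, not a rework of a monolithic process diagram.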
Accordingly, TIBCO implements dynamic BPM in a number of ways:
- controlling workflows via event processing (event-driven BPM)
- separating processes into automated / dynamic (via event processing – event-based BPA) and manual / workflow (via BPM)
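The second split can be sketched as a simple dispatcher (again, hypothetical names, not a TIBCO API): events with a matching automation rule are handled inline by the event-processing path, while everything else is queued as a workflow task for a human participant.

```python
# Hedged sketch of the automated-vs-manual split: CEP-style rules handle
# known event types immediately; unmatched events become workflow tasks.
from collections import deque
from typing import Optional

# Automated / dynamic path: event-type -> handler (illustrative rules only)
AUTOMATED_RULES = {
    "payment_received": lambda e: f"receipt issued for {e['id']}",
    "threshold_breached": lambda e: f"capacity adjusted for {e['id']}",
}

# Manual / workflow path: stands in for a BPM task list
manual_queue: deque = deque()

def dispatch(event: dict) -> Optional[str]:
    """Route an event: fire an automation rule if one matches, else queue it."""
    handler = AUTOMATED_RULES.get(event["type"])
    if handler:
        return handler(event)        # automated, event-processing path
    manual_queue.append(event)       # manual, human-workflow path
    return None

dispatch({"type": "payment_received", "id": "ord-7"})     # handled inline
dispatch({"type": "customer_complaint", "id": "case-3"})  # queued for a person
```

The point of the split is that the automated path stays agile (rules can change without touching the workflow definitions), while genuinely human steps remain explicit workflow tasks.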