The “big elephant in the room” in the ongoing CEP dialog is that most of the current CEP software on the market is not capable of machine learning or statistical analysis of dynamic real-time situations. Software vendors have been promoting and selling business process automation solutions and calling this approach “CEP” when, in fact, nothing about it is new. There is certainly no “technology leap” in these systems as sold today.
The development of the Internet in recent years has made it possible and useful to access many different information systems anywhere in the world to obtain information. While there is much research on the integration of heterogeneous information systems, most commercial systems stop short of the actual integration of available data. Data fusion is the process of fusing multiple records representing the same real-world object into a single, consistent, and clean representation.
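To make that fusion step concrete, here is a minimal Python sketch; the field names, the sample sources, and the majority-vote rule for resolving conflicts are all illustrative assumptions, not taken from any particular system:

from collections import Counter

def fuse_records(records):
    # Fuse duplicate records describing one real-world entity into a single
    # clean representation: majority vote per field, ignoring missing values.
    fused = {}
    fields = {f for r in records for f in r}
    for field in sorted(fields):
        values = [r[field] for r in records if r.get(field) not in (None, "")]
        if values:
            # Keep the most frequently reported value for this field.
            fused[field] = Counter(values).most_common(1)[0][0]
    return fused

# Three sources describing the same customer, with gaps and conflicts.
sources = [
    {"name": "ACME Corp.", "city": "Berlin", "phone": ""},
    {"name": "ACME Corp",  "city": "Berlin", "phone": "+49 30 123456"},
    {"name": "ACME Corp.", "city": None,     "phone": "+49 30 123456"},
]
print(fuse_records(sources))
# {'city': 'Berlin', 'name': 'ACME Corp.', 'phone': '+49 30 123456'}

Real fusion systems obviously need duplicate detection and more careful conflict resolution than a majority vote, but the goal is the same: one consistent, clean record per real-world object.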
I believe that the general consensus among those who study this kind of thing is that any decision made wholly by a computer is an operational decision, even if it affects the behavior or tasks of many people or sub-components. Online decisions, being a subset of automated decisions, would then be operational in nature.
Rule processing is just a style of computation. Of course it is used in a BRMS, but it is also used in CEP. CEP systems typically employ rules-based processing to infer higher-order events by matching patterns across many event streams within the event ‘cloud’. BRMSs use rule processing to match patterns within data tuples representing business-oriented data. CEP systems may support advanced analytics for predictive analysis, reasoning under uncertainty, and other requirements relating to the event cloud. Some of the better BRMSs offer similar analytics for processing business data.
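As a small illustration of the CEP side of that, the following Python sketch derives a higher-order event from a stream of lower-level ones; the event fields, the 60-second window, and the “suspected fraud” pattern are made-up assumptions rather than anything from a specific engine:

WINDOW_SECONDS = 60  # illustrative correlation window

def detect_fraud(events):
    # Infer a higher-order "SuspectedFraud" event when two purchases on the
    # same account arrive from different countries within the window.
    last_seen = {}   # account -> (timestamp, country) of the last purchase
    derived = []     # higher-order events inferred from the stream
    for ev in sorted(events, key=lambda e: e["ts"]):
        prev = last_seen.get(ev["account"])
        if prev and prev[1] != ev["country"] and ev["ts"] - prev[0] <= WINDOW_SECONDS:
            derived.append({"type": "SuspectedFraud",
                            "account": ev["account"], "ts": ev["ts"]})
        last_seen[ev["account"]] = (ev["ts"], ev["country"])
    return derived

stream = [
    {"account": "A1", "ts": 100, "country": "DE"},
    {"account": "A1", "ts": 130, "country": "BR"},  # 30 s later, other country
    {"account": "A2", "ts": 140, "country": "US"},
]
print(detect_fraud(stream))  # one SuspectedFraud event for account A1

A BRMS rule looks structurally similar, except that the facts it matches are business data tuples rather than time-ordered events.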
I don’t know whether you said that a CEP application must necessarily have a model. It may or may not. A rule-based approach (in its usual sense) is not considered a model. In AI terminology, rules are considered “shallow knowledge”, while models are considered “deep knowledge”. Shallow knowledge expresses people’s experience and links symptoms to causes directly, while deep knowledge establishes those links through a model, and the model can be interpreted. Shallow knowledge is very helpful in many cases and, like deep knowledge, it also allows situations to be detected. Of course, combining both is desirable for building more powerful systems. I did a quick search; three entries are listed in the references at the end of this section.
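To show the flavour of that distinction, here is a toy sketch in Python; the domain, the single rule, and the crude cooling “model” are entirely made up for illustration:

def shallow_diagnose(symptom):
    # Shallow knowledge: a rule links a symptom to a cause directly,
    # encoding experience without explaining it.
    rules = {"engine_overheats": "low coolant"}
    return rules.get(symptom, "unknown")

def deep_diagnose(coolant_level, fan_working, load):
    # Deep knowledge: a (very crude) model of the cooling system; the
    # conclusion is derived from the model and can therefore be interpreted.
    heat_removed = coolant_level * (1.5 if fan_working else 0.5)
    if load > heat_removed:
        return "cooling capacity insufficient for current load"
    return "no overheating expected"

print(shallow_diagnose("engine_overheats"))                        # low coolant
print(deep_diagnose(coolant_level=0.3, fan_working=True, load=1.0))

Both approaches can detect the situation; only the second can explain why it arises.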
EDA is more a manifestation of finite state machines going all the way back to Alan Turing. Old_State + Event = Some_Action + New_State. It is the simplest, yet most powerful way to design robust systems. I only wish more people would give it due consideration.
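A back-of-the-envelope sketch of that equation in Python (the states, events, and actions are invented for illustration):

# Old_State + Event = Some_Action + New_State, as a transition table.
TRANSITIONS = {
    ("idle",             "order_received"): ("reserve_stock", "awaiting_payment"),
    ("awaiting_payment", "payment_ok"):     ("ship_goods",    "shipped"),
    ("awaiting_payment", "payment_fail"):   ("cancel_order",  "idle"),
}

def handle(state, event):
    # Unknown (state, event) pairs are ignored and leave the state unchanged.
    action, new_state = TRANSITIONS.get((state, event), ("ignore", state))
    print(f"{state} + {event} -> {action}, {new_state}")
    return new_state

state = "idle"
for event in ["order_received", "payment_ok", "payment_ok"]:
    state = handle(state, event)

Because every reachable combination is spelled out in the table, it is easy to see at a glance what the system will do for any event in any state, which is much of what makes the approach robust.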
A very old implementation example is I/O interrupts (hardware events) for asynchronous I/O: real-time event handling that enabled multitasking operating systems.
Many want to use web services for everything now and at times it is hard to convince people that other messaging schemes and standards are a better fit for some problems.
Various EP applications mentioned besides algorithmic trading include external surveillance by regulators, risk management, auditing, market depth analysis, and more.
CEP is intelligent software that is essentially the next step in algorithmic trading: it sifts through market events looking for possible patterns and acts on them. A recent study of banks’ IT spending patterns by consultancy Aite Group suggested that while budgets as a whole were likely to shrink by 5%, CEP investment remains on an upward trajectory. 36% of respondents to the survey intended to spend more on CEP this year than in 2008.
Adam Honore, senior analyst at Aite and author of the report, says: “We’re still bullish on the potential for CEP across financial services. Once one group successfully deploys a CEP application, word spreads and more technology groups look at CEP to help solve their issues.”
Function answers the question: what is being done?
Technique answers the question: how is something being done?
Application answers the question: what problem is being solved?

Examples: Business Activity Monitoring (BAM) is an application type; it solves the problem of controlling business activities in order to optimize the business, deal with exceptions, etc. Business rules are a type of technique, which can be used to infer facts from other facts or rules (inference rules), to determine an action when an event occurs and a condition is satisfied (ECA rules), and more (there are at least half a dozen types of rules, each a technique for doing something). Event processing is really a set of functions that does what the name indicates, process events; the processing can be filtering, transforming, enriching, routing, detecting patterns, deriving, and more.
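A compact way to see those functions side by side is a small pipeline sketch in Python; the event fields, the threshold, and the queue names below are illustrative assumptions only:

def filter_events(events, min_amount):
    # Filtering: drop events below a threshold.
    return [e for e in events if e["amount"] >= min_amount]

def enrich(event, customer_db):
    # Enriching/transforming: attach reference data to the event.
    return {**event, "segment": customer_db.get(event["customer"], "unknown")}

def route(event):
    # Routing, ECA-style: on this event, if the condition holds, the action
    # is to send it to the appropriate channel.
    return "fraud_queue" if event["segment"] == "unknown" else "billing_queue"

customers = {"C1": "gold"}
events = [
    {"customer": "C1", "amount": 250.0},
    {"customer": "C9", "amount": 40.0},
    {"customer": "C9", "amount": 900.0},
]
for e in filter_events(events, min_amount=100.0):
    e = enrich(e, customers)
    print(route(e), e)

Pattern detection and derivation would sit on top of such a pipeline, correlating several events over time rather than handling each one in isolation.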
R. Bruckner, B. List, and J. Schiefer. In DaWaK 2002: Proceedings of the 4th International Conference on Data Warehousing and Knowledge Discovery, pages 317-326. Springer-Verlag, London, UK, 2002.
J. Schiefer, S. Rozsnyai, C. Rauscher, and G. Saurer. In DEBS '07: Proceedings of the 2007 Inaugural International Conference on Distributed Event-Based Systems, pages 198-205. ACM, New York, NY, USA, 2007.
J. Llinas, C. Bowman, G. Rogova, A. Steinberg, and F. White. In P. Svensson and J. Schubert (Eds.), Proceedings of the Seventh International Conference on Information Fusion (FUSION 2004), pages 1218-1230, 2004.