There has been a lot of talk recently comparing complex event processing (CEP) and business intelligence (BI) technology. Much of it suggests that CEP is a logical extension of BI into real-time, continuous analysis. For example, Seth Grimes (@sethgrimes) suggested the distinction here, and Paul Vincent from TIBCO responded by quoting Tony Baer from Ovum (@TonyBaer), who tweeted the following:
"So, in essence, CEP makes BI literally actionable. IOW (in other words) BI is embedded in execution."
These posts triggered a Twitter discussion about how true Tony’s statement was and the breadth of functionality available from CEP platforms like StreamBase. It culminated in Neil Raden’s (@NeilRaden) question:
"Are you suggesting that CEP, on its own, is an adequate technology for adaptive control & integrating models/rules from BI and BRMS?" (tweet)
Neil’s question crystallized the way CEP is misunderstood by those who are more familiar with traditional data management technology. They try to assess a tool that’s new by comparing it to another tool that’s built for an entirely different purpose.
The core technology of CEP is real-time, continuous data transformation. These transformations can include event pattern detection, data enrichment, data analysis, and data reduction. They generally also include support for business rules and for executing code written in traditional programming languages like Java or C++. All of this is bundled into a high-performance engine designed to run scalably and reliably, supported by powerful development environments and by administrative and visualization tools.
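To make that list concrete, here is a minimal sketch of the kind of continuous transformation a CEP engine performs, written in plain Python rather than an actual StreamBase application. The symbols, prices, and the `SECTOR` lookup table are assumptions for illustration only; the point is that each event is filtered, enriched, and analyzed the moment it arrives, with only a bounded window of state.

```python
from collections import deque

# Hypothetical reference data used to enrich events; a real deployment
# would pull this from a cache or database, not a literal dict.
SECTOR = {"IBM": "tech", "XOM": "energy"}

def process(events, window=3):
    """Continuously transform a stream of (symbol, price) events:
    filter bad ticks, enrich with reference data, and emit a
    sliding-window average as each event arrives."""
    prices = deque(maxlen=window)        # bounded state, not a full table
    for symbol, price in events:
        if price <= 0:                   # data reduction: drop bad ticks
            continue
        prices.append(price)
        yield {                          # enrichment + analysis per event
            "symbol": symbol,
            "sector": SECTOR.get(symbol, "unknown"),
            "avg": sum(prices) / len(prices),
        }

# Events flow through one at a time -- no batch load, no warehouse query.
ticks = [("IBM", 100.0), ("IBM", -1.0), ("IBM", 102.0), ("XOM", 80.0)]
results = list(process(ticks, window=2))
```

A production engine adds parallelism, persistence, and tooling around this core loop, but the programming model is the same: a standing query over data in motion, rather than a one-off query over data at rest.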
That’s a lot of capabilities that can be brought to bear on many different problems. No wonder some people aren’t entirely sure where to start when attempting to understand what is possible with CEP.
If you know BI, you naturally try to fit CEP into a BI context. Yet in fact, StreamBase has never been compared to BI by a prospective customer, despite the fact that "Operational Business Intelligence" has been a buzzword in the BI world for about four years. If only we could do BI faster, the theory goes, businesses could take advantage of it in the millions of little decisions that affect the business day to day. CEP and BI have similar-sounding mantras, yet practically speaking, customers never intermingle the two.
What those immersed in BI don't realize is that the barrier to operational data analysis is not the speed of the data warehouse. As any trader will tell you, the fastest data analysis in the world won’t help you if every trade requires human approval.
To take advantage of modern data analysis, you need to stop depending on people to pull the oars and start making them captains of their own ship. This requires software that can not only analyze a situation, but also take action, and then control that action over time. Workers ("Automation Managers" in the diagram on the left) monitor these systems and cope with exceptions, while the majority of events and decisions are processed automatically, without incident. The user's role is to guide real-time activity, not to stop the flow, make a decision, and start the flow again.
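This management-by-exception pattern can be sketched in a few lines. The sketch below is an assumption-laden illustration, not StreamBase code: the `decide` and `is_exception` callables and the order-amount threshold are hypothetical stand-ins for whatever rules a real deployment would use.

```python
def run(events, decide, is_exception):
    """Handle most events automatically; route the rest to a human queue.

    `decide` is the automated action; `is_exception` flags the events an
    Automation Manager must review (both are hypothetical callables).
    """
    handled, review_queue = [], []
    for event in events:
        if is_exception(event):
            review_queue.append(event)    # human guides; the flow never stops
        else:
            handled.append(decide(event)) # the common case: fully automatic
    return handled, review_queue

# Example: auto-approve small orders, escalate large ones for human review.
orders = [120, 45, 9_500, 60]
auto, review = run(orders,
                   decide=lambda amt: ("approved", amt),
                   is_exception=lambda amt: amt > 1_000)
```

The key design point is that the human sits beside the loop, not inside it: the three small orders are approved with no one pressing a button, and only the outlier waits for a person.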
Consider modern trading systems. Years ago, trading floors had hundreds of traders – half of them were making low-level trading decisions based on real-time market data that they watched; orders were placed by typing them into an order management system (OMS); it took 50-100 milliseconds for each trader, once he made a decision, to press the button to buy or sell. Today, algorithms designed by quants buy and sell stocks in microseconds, not milliseconds, and one trader with a real-time application can guide automated behavior that does the work of 1,000 old-school traders, orders of magnitude faster than just five years ago. That's change in an industry, motivated by a new way of thinking about real-time data processing – complex event processing.
These software applications become the backbone of an automated business. Change is the norm - as the business climate evolves, agility becomes a critical success factor. If the system can be developed and delivered efficiently enough, a wide variety of decisions become suitable for automation, and by making millions of small decisions intelligently, massive efficiency gains are possible. Decision support transforms from being a part of the process to guiding automated processes – a fundamentally new role that makes old-school BI thinking inapplicable.
But transforming a business to real-time requires not only new technologies, but a new way of thinking as well. It requires a whole set of business practices that only a few companies are yet comfortable with. And it requires analysts and software architects who approach systems in a wholly different way. Neil Raden himself talks about this in his book, Smart Enough Systems:
"You have to turn operational decision making into a corporate asset you can measure, control, and improve."
Wall Street reached the tipping point about five years ago, and has been creating not only the software systems but also the expertise necessary to run a truly real-time business. CEP systems on Wall Street not only monitor markets and customers for opportunities, they also take actions and continuously monitor those actions for successful execution.
The answer to Neil’s question is that yes, if a business is ready to adaptively control processes with models, rules, and analytics, then CEP is the ideal technology for integrating analysis with action in real time.
(Richard Tibbetts is the Chief Technology Officer of StreamBase. Follow him on Twitter as @tibbetts.)