When I first saw an event processing language 10 years ago, as a C++ developer, it opened my eyes to a new development "physics" that made dealing with information in motion simple. Today, it seems like all modern data is in motion: streaming social media data, market data in financial services, and the growing wave of internet-of-things and sensor-driven systems. Clearly, the technology needed to manage data in motion is different from that needed to manage data at rest, and that's where StreamBase complex event processing, or CEP, comes in: it's the ultimate platform to monitor, analyze, and act on data in motion.
Complex event processing doesn't mean the technology is complicated to use; quite the opposite. The term refers to the ability to identify complex sequences of conditions as data moves through an enterprise.
Three CEP capabilities, aggregation, correlation, and temporal analytics, provide the foundation for dealing with data in motion.
For example, in the gift card market, suppose you wanted to detect a common fraud condition, such as: "Tell me when the total value of gift card redemptions at any POS machine is above $2,000 in any one-hour period." This situation is easily detected by CEP's aggregation facility, which continuously calculates metrics across sliding, time-bound windows of moving data to reveal real-time trends. Continuous aggregation is hard to do with traditional tools, but with CEP it's built in: simply drag the aggregation operator onto your application template, and you're off.
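In StreamBase this logic is configured graphically with an aggregation operator, but the underlying idea of a sliding time window can be sketched in plain Python. This is an illustrative sketch only, not StreamBase code; the class name, threshold, and alert format are hypothetical:

```python
from collections import defaultdict, deque

WINDOW_SECS = 3600      # sliding one-hour window
THRESHOLD = 2000.0      # alert when redemptions exceed $2,000

class SlidingWindowAggregator:
    """Continuously sums gift card redemptions per POS terminal over a
    sliding one-hour window, emitting an alert when the sum breaches
    the threshold. Timestamps are seconds; amounts are dollars."""

    def __init__(self):
        self.windows = defaultdict(deque)   # terminal_id -> deque of (ts, amount)
        self.totals = defaultdict(float)    # terminal_id -> current window sum

    def on_event(self, terminal_id, ts, amount):
        win = self.windows[terminal_id]
        # Evict events that have slid out of the one-hour window
        while win and win[0][0] <= ts - WINDOW_SECS:
            _, old_amount = win.popleft()
            self.totals[terminal_id] -= old_amount
        win.append((ts, amount))
        self.totals[terminal_id] += amount
        if self.totals[terminal_id] > THRESHOLD:
            return f"ALERT: terminal {terminal_id} redeemed ${self.totals[terminal_id]:.2f} in 1h"
        return None
```

As each redemption event arrives, the window for its terminal is trimmed and the running total updated, so the metric is always current without rescanning history.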
The second key principle of CEP is correlation: connecting to multiple streams of data in motion and, over a period that may span seconds or days, identifying that condition A was followed by B, then C. For example, if we connect to streams of gift card redemptions from 1,000 point-of-sale terminals, CEP lets us continuously compare the terminals to one another with conditions like: "Alert me if the gift card redemptions in one store are above 150% of the average of the other stores."
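The cross-store comparison in that rule can be sketched as a simple function over per-store window totals. Again, this is an illustrative sketch rather than StreamBase's actual correlation operator; the function name and 150% ratio mirror the example above:

```python
RATIO = 1.5  # flag a store above 150% of the others' average

def check_outliers(totals):
    """totals: store_id -> redemption total over the current window.
    Returns the stores whose total exceeds 150% of the average of
    all *other* stores, comparing each store against its peers."""
    alerts = []
    for store, value in totals.items():
        others = [v for s, v in totals.items() if s != store]
        if others:
            average = sum(others) / len(others)
            if value > RATIO * average:
                alerts.append(store)
    return alerts
```

In a real deployment this check would run continuously as each store's window total updates, rather than on a static snapshot.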
Finally, CEP is designed for temporal analytics: the idea of treating time as a first-class computing element. Temporal analytics are critical for processing data where the rate and momentum of change matter. In fraud, for example, sudden surges of activity are a common indicator, so a rule might read: "If the number of gift card sales and activations within 4 hours is greater than the store's average number of daily activations over the previous week, stop approving activations." Unlike computing models designed to summarize and roll up historical data, CEP is designed to ask and answer these questions on data as it changes.
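The surge rule above compares a short recent window against a longer historical baseline. A minimal sketch of that comparison, with hypothetical function and parameter names, might look like:

```python
def surge_detected(activations_last_4h, daily_activations_prior_week):
    """Temporal surge check: flags when activations in the last 4 hours
    exceed the store's average *daily* activations over the prior week.
    activations_last_4h: int count from the recent 4-hour window.
    daily_activations_prior_week: list of 7 daily counts."""
    avg_daily = sum(daily_activations_prior_week) / len(daily_activations_prior_week)
    return activations_last_4h > avg_daily
```

The point is that both operands are time-scoped: one over a rolling 4-hour window, one over the trailing week, and the engine re-evaluates the comparison as events arrive rather than in a nightly batch.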
These three elements, aggregation, correlation, and temporal analytics, set CEP apart from other computing platforms by helping technologists see what's happening now, not just what happened in the past. Data in motion, and the use of CEP to analyze it, is the future of computing.