Event-driven systems are new to many firms. Automating existing manual processes with event-driven techniques, or properly simulating real-time analytics systems, is a new experience for many.
Fortunately, some industries have been on the forefront of designing for real-time. In particular, banks in the capital markets trading arena have been automating and building event-based applications for years.
So on a recent tour with our customers, we asked them for their best lessons learned and the business impact of the systems they've built.
One IT executive told me: "CEP allows me to build event-driven apps in 2-4 weeks, rather than 6-9 months with an app server and database - so I can say 'yes' to more requests from my business clients, which helps us innovate more quickly. We keep the great ideas, and discard the ones that don't work."
The most consistently successful firms that moved to real-time on Wall Street started with a small, critical application, learned lessons from its development, then applied those lessons to other projects and groups. Over time, they developed a center of excellence with firm-wide success in real-time application development.
Systems that work primarily with historical data have trained many technologists to capacity-plan for the aggregate amount of data they manage. Real-time systems are different, and should be planned for spikes in volume, not just aggregate volumes. This best practice explores three approaches - burst behavior, the use of conflation, and computational reduction - that help applications gracefully manage spikes and deliver correct, timely responses to users and automated actions.
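Conflation, for example, can be thought of as a latest-value cache per key: during a burst, intermediate updates to the same key are collapsed so a slow consumer only processes the most recent value. Here's a minimal Python sketch of the idea; the class and method names are illustrative, not any particular product's API:

```python
class Conflator:
    """Keep only the latest pending update per key.

    During a volume spike, N updates to the same key collapse
    into one pending entry, so a consumer that drains at its own
    pace always sees current values instead of a growing queue.
    """

    def __init__(self):
        self._latest = {}

    def update(self, key, value):
        # Overwrite any pending value for this key; during a burst,
        # intermediate values are discarded in favor of the newest.
        self._latest[key] = value

    def drain(self):
        # Hand the pending snapshot to the consumer and reset.
        pending, self._latest = self._latest, {}
        return pending


conflator = Conflator()
for price in [100.0, 100.5, 101.0]:  # burst of quotes for one symbol
    conflator.update("IBM", price)
conflator.update("MSFT", 30.25)

snapshot = conflator.drain()
print(snapshot)  # → {'IBM': 101.0, 'MSFT': 30.25}
```

The trade-off is that intermediate values are lost, which is usually acceptable for display feeds and risk dashboards but not for audit trails, where every event must be retained.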
Keep coming back throughout 2013 as we continue to add to our library of real-time system best practices!
Numerical computing environments are often used for algorithm development, data analysis, visualization, matrix manipulation, and statistical analysis. In finance, quants use these environments for time-series analysis, portfolio optimization, asset liability modeling, quantitative risk modeling, and derivative pricing. This best practice describes how to do this kind of computing in real-time, using MATLAB together with StreamBase CEP.
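The general pattern is that events flow through a stream-processing pipeline, which invokes a numerical routine on each new tick or window rather than in batch. A rough Python sketch of that pattern, using a rolling volatility calculation as the stand-in for the numerical routine (this illustrates the per-event style of computation, not StreamBase's or MATLAB's actual APIs):

```python
import math
from collections import deque


def rolling_volatility(window_size):
    """Return a per-event callback that computes the volatility of log
    returns over a sliding window - the kind of numerical routine a CEP
    pipeline would delegate to an analytics engine on each new tick."""
    prices = deque(maxlen=window_size)

    def on_tick(price):
        prices.append(price)
        if len(prices) < 2:
            return None  # not enough history yet
        window = list(prices)
        returns = [math.log(b / a) for a, b in zip(window, window[1:])]
        mean = sum(returns) / len(returns)
        variance = sum((r - mean) ** 2 for r in returns) / len(returns)
        return math.sqrt(variance)

    return on_tick


vol = rolling_volatility(window_size=20)
for price in [100.0, 100.5, 99.8, 100.2, 101.1]:
    latest = vol(price)  # recomputed on every event, not in a batch job
print(latest)
```

The key difference from batch analytics is that the state (here, the sliding window of prices) lives inside the stream, so results are available the moment each event arrives.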
The prospect of automating a business often sparks a debate about which is more effective, man or machine. It turns out that the debate presents a false choice: effective automation lies in the proper balance between the two.
This article explores five steps to balance man and machine, including placing more data scientists in executive positions, applying real-time risk management and surveillance, combining historical and real-time analytics in one place, and more.