(Mark Palmer is the CEO of StreamBase Systems - follow him at twitter.com/mrkwpalmer)
The information technology world is buzzing about event processing (or complex event processing, CEP) as one of the hottest areas in software innovation. For example, Time Magazine declared event processing one of eight “Innovations that will change your life”; the World Economic Forum at Davos named it one of the 26 top innovations in the world; Gartner analyst Roy Schulte calls it the key to “agile companies”; and it appears on top trends lists for 2010 from Entrepreneur / VentureBeat, CIO Magazine, and well-respected IT industry analyst Brenda Michelson.
Before getting to predictions for the future, let’s step back and look at what the hype is all about.
What’s the problem? Why event processing? What is event processing?
Let’s start with the problem: the physics of data has fundamentally changed, and it will stay changed for the foreseeable future. The first 50 years of data processing managed data at rest; the next 50 years will manage information in motion. Sensor data, digitized stock market feeds, click-stream data from online commerce, and events from real-time social media activity: all of this information is in motion, streaming through the global high-speed network of computers (watch this 1-minute video tutorial on event processing for an overview).
Not only is today’s data in motion, there’s more of it. In two minutes, the data stream from Twitter generates the amount of text found in the entire works of William Shakespeare. A single feed of U.S. stock market data generates the same volume in under three seconds.
Some industries, like financial services, have quietly figured out how to harness the power of this streaming data — for example, 70% of the trading done in the U.S. stock market every day depends on streaming data, and much of it uses event processing technology. The federal government uses event processing broadly, and uses of event processing outside of these markets are growing.
The early impact of event processing has been profound because it presents an opportunity to innovate by applying a new computing physics for moving data. In the late ’90s, pioneers of the relational database (Dr. Michael Stonebraker and others) had a simple idea: process moving data with a language borrowed from SQL, the lingua franca of databases, but implement the processing in a way that eliminates the latency and restrictions of storing data to disk. The objective was to help applications find needles before they go into the haystack. Metaphorically speaking, a business that uses event processing is like the bear in the photo above: the bear knows, in real time, where the salmon is. If the bear had to wait for a database, he'd find out next week where the salmon was.
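To make the idea concrete, here is a minimal sketch in Python (not StreamBase's actual language) of a standing query that evaluates each event the instant it arrives, rather than storing everything and querying the history later; the event fields, symbols, and threshold are illustrative assumptions, not taken from any real feed.

```python
# A toy continuous query: evaluate each event as it flows by, instead of
# writing everything to disk and querying the history afterward.

def trades(feed):
    """Pretend stream of trade events (symbol, price, size)."""
    yield from feed

def standing_query(stream, symbol, min_size):
    """Emit an alert the moment a matching event arrives."""
    for event in stream:
        if event["symbol"] == symbol and event["size"] >= min_size:
            yield f"ALERT: {event['size']} shares of {symbol} at {event['price']}"

feed = [
    {"symbol": "IBM", "price": 120.10, "size": 500},
    {"symbol": "HPQ", "price": 48.25, "size": 12000},
    {"symbol": "IBM", "price": 120.40, "size": 25000},
]

for alert in standing_query(trades(feed), symbol="IBM", min_size=10000):
    print(alert)  # fires on the third event, with no database round-trip
```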
It's a simple idea, but not an easy one to implement. Ten years on, driven by pioneers in financial services, event processing now handles quadrillions of dollars a year on today's virtual, electronic Wall Street. The natural ten-year gestation period is ending, the technology is proven, and it's fun to see it proclaimed the next big thing.
Predictions for the Evolution of Event Processing
OK, so lots of smart people think event processing is hot and important. But history is littered with dead technologies that were once proclaimed as “the next big thing.” What does the future of event processing hold? Since a streambase sounds as important as a database, will Oracle own this space? What role will open source play? Will the existing IT infrastructure providers be the gorillas or the monkeys of the event processing era?
As the CEO of what most consider to be the leading pure play event processing company, I'm fortunate to see signs that foreshadow the emergence of a great industry. So here are my predictions for the future of event processing, beginning with the dramatic change in the nature of event processing customer deployments that has occurred in just the past eighteen months.
Prediction #1: The 100+ Event Processing Applications Per Firm Milestone will be Passed. Event processing turned an important corner in 2009: it went from an innovative enigma to a key architectural elixir in some of the world's most innovative companies. Previously, early adopters used event processing in applications like algorithmic trading, fraud detection, location-based services, and homeland security. These applications were important first steps, and a sign of things to come.
But the turning point in 2009 was news that some of the world's biggest companies went big with event processing. The watershed event came when the CME Group, one of the largest financial exchanges in the world, which trades $1.3 quadrillion (1,300,000,000,000,000) in assets a year, adopted event processing enterprise-wide. Other big names, including RBC, ConvergEx, CMC Markets, and City Index, made major announcements about event processing. These firms aren't just building a single application with event processing; they have made broad, multi-application, enterprise-wide commitments to a new technological backbone.
Beginning in 2010, watch for the next step — for individual firms to deploy over a hundred event processing applications throughout their new IT foundation — a sign that event processing is coming of age as the backbone for the real-time enterprise.
Prediction #2: The "Deluge of Real-Time Data" Will Not Drive Event Processing Adoption
Yes, it's true that three seconds of stock market data can generate more data than Shakespeare did in his entire life's work. Yes, the best event processing platforms can process that volume with latency measured in microseconds (a microsecond is one millionth of a second; it takes light 5.4 microseconds to travel a mile in a vacuum). Yes, speed is often the main selection criterion among CEP products (for example, trading specialist CMC Markets processed 3,000 events a second with one vendor, switched, and currently processes 17,000 events a second). Yes, StreamBase is known for its blinding speed, and we make it better with every release.
But will speed be the most dominant reason why event processing is adopted? No.
Still, headlines proclaim that the raison d'être of event processing is to "tame the data deluge," but those stories miss the point: most often, event processing will be adopted for other reasons.
Prediction #3: Cognitive Physics Will Be as Important as Computing Physics
As Clayton Christensen suggests in The Innovator's Dilemma, simplicity is always a disruptive innovation, and today this is more true than ever. The half-life of software is measured in weeks and days, not years and decades. In computing's first 50 years, frictionless electronic distribution wasn't the norm; now, with high-speed networks, cloud computing, and software as a service, customers expect new features immediately, and because switching costs are so much lower, they'll leave (or be acquired) more quickly than before.
If you want to do a job faster and better, you need the right tools. For knowledge-based work, the right tool aligns the "cognitive physics": that is, a good tool choice aligns the way of thinking about a problem with the way the solution is built. When cognitive physics matches implementation physics, barriers to implementation fall. Event processing tools can break down these barriers in an organization that has moving data. For example, PhaseCapital, a Boston-based hedge fund, built its entire trading system with event processing in four months; CEO Eric Pritchett estimates it would have taken four years with traditional programming tools. More importantly, PhaseCapital changes its system every day: the team tests and refines existing ideas and invents new ones.
So the future of innovation with event processing will be governed by its ability to reduce the friction between the way you think about a problem and the way you solve a problem. For an event-based business, ideas turn into applications more quickly, improvements are made more quickly, and problems are fixed more quickly. And this kind of speed — the speed to find, implement, and get a new idea to market — is a key element of a truly "agile" company.
Prediction #4: Quantitative Thinking will Trump Traditional Thinking
Earlier this month, the Bureau of Labor Statistics forecast that 2 of the top 3 biggest growth areas for jobs in the U.S. will be tech-driven: a whopping 1.5 million jobs over the next 10 years (coverage from the Huffington Post). Yet the vast majority of Fortune 100 CEOs are still technology slackers (as measured by participation in social media; hardly a perfect measure, but it's scary).
Modern tech-driven innovation will be led by a new kind of business leader who has a keen sense of technology, and barriers between C-Level executives and IT must fall. Two industries already illustrate the trend that will prevail in all industries over the next 50 years: capital markets and baseball.
Yup, that's right, let's start with baseball. In the past 20 years, the organizational principles of baseball have changed from gut feel and folklore to robust statistical analysis of every single element of the game (with gut feel applied afterward). Bill James began his statistical analysis of what made teams win in 1977, and it was brought to popular consciousness by the best seller Moneyball in 2003. The Boston Red Sox didn't just end their fabled 86-year championship drought after adopting a quantitative approach to baseball; they became the most successful team of the decade. And it's no accident that the principal owner of the Red Sox is financial guru John Henry, which leads us to the other exemplar for future business: financial services.
Financial services is already 20 years into this transition. In the past 10 years, mathematical, real-time trading has risen from less than 5% to over 70% of trading activity in U.S. stocks alone, and the rest of the industry is becoming electronic. 20 years ago, in my first job out of college, I worked on a trading floor. The traders were MBAs from Harvard and Stanford - brilliant, personable, and competitive. The ultimate knowledge workers. Now, the top traders are mathematicians. They are competitive, but in a more cerebral way. 20 years ago, trading floors were loud and boisterous - traders shouted down phones, and at each other; now, they are intense mission control centers. Shouting has been replaced by the clicking of keys on keyboards. Analytical acumen is now just as important as physical intensity and passion.
Over the next 50 years, other industries will evolve in much the same way: the winners of the modern economy will adopt quantitative strategies as much as traditional business strategies.
Event processing facilitates quantitative thinking with its modern, visual development tools. These tools encourage analytical thinking and lower the barrier between business concept and implementation (see Prediction #3: cognitive physics will be as important as computing physics).
For example, a trading strategist can describe a strategy like: “If the price of IBM goes up more than 2% in any 10 minute time window, but HP doesn’t, then buy HP, but only if the spread holds.” These words, this strategy, this business logic, can be expressed directly with visual tools by business analysts and IT. And the modern business leader will need to communicate fluently in this language - they won't have to code it, but they'll have to understand the math of their business deeply, and in detail.
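As an illustration, here is a rough Python sketch of that strategy. In a product like StreamBase this logic would typically be drawn in a visual editor rather than hand-coded, and the window length, tick format, and the "spread holds" test below are simplifying assumptions for the sake of the example.

```python
from collections import deque

WINDOW = 10 * 60  # ten-minute sliding window, in seconds

class PriceWindow:
    """Keeps (time, price) pairs for one symbol within the sliding window."""
    def __init__(self):
        self.ticks = deque()

    def add(self, t, price):
        self.ticks.append((t, price))
        while self.ticks and t - self.ticks[0][0] > WINDOW:
            self.ticks.popleft()

    def pct_change(self):
        if len(self.ticks) < 2:
            return 0.0
        first, last = self.ticks[0][1], self.ticks[-1][1]
        return (last - first) / first * 100

ibm, hp = PriceWindow(), PriceWindow()

def spread_holds(ibm_px, hp_px, max_spread=80.0):
    # Placeholder for the strategist's "only if the spread holds" condition.
    return abs(ibm_px - hp_px) <= max_spread

def on_tick(t, symbol, price):
    """Called for every incoming price event."""
    (ibm if symbol == "IBM" else hp).add(t, price)
    if ibm.pct_change() > 2.0 and hp.pct_change() <= 2.0:
        if spread_holds(ibm.ticks[-1][1], hp.ticks[-1][1]):
            print(f"{t}s: BUY HP at {hp.ticks[-1][1]}")

# A few illustrative ticks (seconds, symbol, price):
for tick in [(0, "IBM", 100.0), (0, "HP", 45.0),
             (300, "IBM", 101.0), (300, "HP", 45.2),
             (540, "IBM", 102.5), (540, "HP", 45.1)]:
    on_tick(*tick)
```

The point is not the code itself but how directly the strategist's sentence maps onto the windowed conditions; a visual event-flow tool makes that mapping even tighter.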
By lowering the barrier to express quantitative business strategies, innovation cycles accelerate.
But with only 2 of the Fortune 100 CEOs on Twitter, the opportunity to out-think your competition is wide open. So the winners of tomorrow will be led by those who get comfortable with technology today.
Prediction #5: Software Stacks will Continue to Miss the Mark
Rupert Murdoch said: “The world is changing fast. Big will not beat small anymore. It will be the fast beating the slow.” This is certainly true in the technology space - big companies can innovate, but only if they act small.
In the world of enterprise software, Oracle, IBM, and Microsoft are big, and they don't act small. In fact, over the last decade, they have acquired their way to an ever more complex portfolio of products - they act even bigger than they actually are. Moreover, these big product stacks are designed for traditional computing - computing that looks at what happened in the past, not at what's happening in real time.
Recently, these big vendors became aware of event processing. What's the natural thing to do? Buy some technology and slap it on the stack! Even Progress Software, which acquired CEP pioneer Apama, lost its focus to chase the big stack companies; this week, for example, it acquired a small business process management vendor, Savvion, as the centerpiece of yet another stack strategy.
These stacks are to event processing as the record industry is to iTunes - a different industry for a different era. Customers don't want big stacks; they want tools that solve their problems.
The proof lies in the results, as measured by customers and the specific business value they achieve. In the past year, the endorsements have gone to small, focused event processing companies, not the big stack vendors.
StreamBase's customers aren't anonymous; they are serious corporations making enterprise-wide endorsements, they manage quadrillions of dollars of business every year, and they furnish technical details. The partial list of public endorsements includes CME Group, RBC, BNY ConvergEx, City Index, CMC Markets (which switched from stack vendor Progress Apama), Curex, PhaseCapital, BlueCrest, and Kairos. And these are public endorsements in one of the most secretive industries there is: the capital markets.
Credible customer stories don't lie, and they show that in CEP, small is beating big. Watch for this trend to continue as the nimble, pure-play players continue their astounding growth and high-profile endorsements, running circles around less agile competitors pushing big stacks of software designed for yesterday's applications.
Prediction #6: A New Event-Based Software Stack will Emerge
Just because the traditional stack vendors aren't players in event processing doesn't mean event processing can get by without a complete software solution. The new event processing stack has elements that the big stack vendors don't want you to know about (because they can't sell them to you).
The elements of any software infrastructure are:
- Connectivity
- Business logic
- Presentation layer (GUI)
- Database
For event-based systems, connectivity is the most important element. It includes middleware like TIBCO, but more importantly it includes recent entrants such as Solace and Tervela, web technologies like WebSockets (as popularized by Google Chrome), and standards like AMQP and DDS. AMQP is very attractive as a standards-based messaging wire protocol. Solace, Tervela, Red Hat, Apache (Qpid), RabbitMQ, and various other messaging vendors are working hard on AMQP interoperability. CEP is great at analytics. AMQP is great at distribution. CEP over AMQP simplifies integration. Industry-specific connectivity is also critical; for example, in the capital markets, connectivity to FIX, Reuters, Wombat, FX venues, and so on helps speed integration and bring CEP online.
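As a sketch of what "CEP over AMQP" can look like, here is a small Python consumer using pika, an open source AMQP client; it assumes a RabbitMQ broker on localhost, a queue named "ticks", and JSON price events, all of which are illustrative assumptions rather than a prescribed setup.

```python
# AMQP handles distribution; the consumer applies the analytic to each event.
# Assumes a local RabbitMQ broker, a queue named "ticks", and JSON messages
# like {"symbol": "IBM", "price": 120.4} - all illustrative choices.
import json
import pika  # open source AMQP client (pip install pika)

last_price = {}  # tiny bit of state kept by the event processor

def on_message(channel, method, properties, body):
    """Apply the analytic to each event as it is delivered."""
    event = json.loads(body)
    symbol, price = event["symbol"], event["price"]
    prev = last_price.get(symbol)
    if prev is not None and prev > 0:
        move = (price - prev) / prev * 100
        if abs(move) > 1.0:  # arbitrary threshold for the sketch
            print(f"{symbol} moved {move:+.2f}% on one tick")
    last_price[symbol] = price

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.queue_declare(queue="ticks")
channel.basic_consume(queue="ticks", on_message_callback=on_message, auto_ack=True)
channel.start_consuming()  # blocks, processing events as they arrive
```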
Business logic is the domain of event processing platforms like StreamBase.
At the database level, relational databases still play a role to store history, but just as important, if not more so, are column-oriented databases like Vertica.
Finally, some of the most innovative presentation layer technology for time-series data is coming from the Eclipse open source community.
What’s the pattern in all this stack innovation? None of it comes from the big companies - it’s coming from small innovative startups that are breaking down the physics of traditional computing as we form a new stack for the next 50 years of innovation.
Prediction #7: Stream-Based Platforms will Continue to Lead a Siege Against the $15B Database Market
It's flattering that Entrepreneur / VentureBeat chose StreamBase as one of the technologies that will continue to lead a siege against the $15 billion database market, but the truth is that multiple technologies in the real-time computing stack are required to make this happen, including high-speed connectivity, column-oriented data stores, and CEP.
Event-based platforms are about real-time computing; databases, business intelligence, and data warehouses are about looking at what happened in the past. By combining the two to operate on streaming data, you create an optimal platform for the real-time enterprise.
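Here is a tiny Python sketch of that combination, using SQLite purely as a stand-in historical store: the history answers "what is normal?", and the streaming side asks that question for every live event. The table, column names, and figures are invented for illustration.

```python
# Combine a historical store (what is normal?) with streaming logic
# (is this live event abnormal right now?). Schema and numbers are made up.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE daily_volume (symbol TEXT, day TEXT, volume INTEGER)")
db.executemany("INSERT INTO daily_volume VALUES (?, ?, ?)",
               [("IBM", "2010-01-04", 5_000_000),
                ("IBM", "2010-01-05", 6_200_000),
                ("IBM", "2010-01-06", 5_800_000)])

def historical_avg(symbol):
    row = db.execute("SELECT AVG(volume) FROM daily_volume WHERE symbol = ?",
                     (symbol,)).fetchone()
    return row[0] or 0

def on_volume_event(symbol, volume_so_far):
    """Streaming side: flag a live day already above the historical average."""
    avg = historical_avg(symbol)
    if avg and volume_so_far > avg:
        print(f"{symbol}: intraday volume {volume_so_far} already exceeds "
              f"historical daily average {avg:.0f}")

on_volume_event("IBM", 6_500_000)  # fires against the sample history
```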
So watch for the siege on this market to continue as these data management techniques help corporations find needles before they go into the haystack of corporate databases.
Prediction #8: Open Source CEP Won't Impact the Market, But Open Source Will
Event processing platforms are robust and proven, but there is still constant innovation in the space. The efforts of the open source projects Esper and Cayuga are admirable, but both miss the importance of simplicity through graphical event languages and powerful development tools. They also aren't proven for mission-critical usage, or optimized for high-performance, low-latency demands.
However, open source is helping spur the growth of event processing in important ways:
- The rise of Eclipse provides instant access to a vast community of development tool plug-ins for event processing development.
- Connectivity is commoditizing with developments in QuickFix, AMQP implementations like RabbitMQ, and DDS implementations like OpenSplice.
- Events-in-the-cloud is becoming easier with open source cloud computing and web-based messaging protocols.
The rise of open source for development, connectivity, and cloud computing is also why the big stack vendors will continue to miss the mark with their proprietary stacks of software.
So as the new event-based software stack begins to emerge, event processing providers that embrace it will continue to grow as the enterprise platform of choice.
Prediction #9: Event Processing will Yield a Great New Software Powerhouse
The ninth and final prediction for event processing is that the processing of moving information will yield a great independent software company.
While it's nice that Time Magazine declared event processing is a technology that will "change your life," and the World Economic Forum at Davos named event processing as one of the 26 top innovations in the world, the previous 8 predictions explain why event processing will prove fertile ground for a great software sector:
- The business case - to enable the rapid construction of real-time applications - has already impacted two of our most critical markets: the capital markets and the intelligence agencies.
- The solution requires a fundamentally new computing physics where traditional static computing technologies don't fit.
- The technology is proven, with major companies already using CEP enterprise-wide.
- The revolutionary simplicity of event processing is, as Clayton Christensen suggests, "always disruptive," and therefore facilitates disruptive innovation.
- This innovation has become even more critical in an era where competitive advantage is becoming more quantitative.
- The market for data management is huge and the big software stack vendors continue to miss the mark with their technology offerings.
- The possibility of an entirely new software stack is upon us - open source is a big part of it, and innovation in the field should continue for the next 10 years.
Put all of the conditions together, and it's ripe territory for the emergence of a new software powerhouse that disrupts the old guard.