Is the time right for Progress Software to be bought?

In the course of my ongoing analysis of software infrastructure vendors I was intrigued by the recent earnings release from Progress Software…

…and it caused me to dig a bit deeper. Basically, Progress is holding its revenue stream although not growing it, and I guess in today’s environment that is OK. But when the performance of the company over the last few years is considered, a different picture starts to build up.

In essence, Progress made a lot of money from its OpenEdge database product, and this business still provides a rich ‘cash-cow’ revenue stream. However, not only has it stagnated, it is starting to decay, with Q109 showing a sharp drop. Admittedly this is probably in part due to currency movements, but the trend is clear – this is not a growing business and the writing is on the wall, at least in the longer term.

Progress knows this, and so over the past few years it has been on the acquisition trail, trying desperately to find a new business that can grow sufficiently to become the new OpenEdge. It has tried the area of data, with its DataDirect division growing through acquisition, but this business has reached a steady state with little or no growth. It tried the area of messaging, being the company that brought the term ESB (Enterprise Service Bus) to the world through its SONIC line of business, but having gained great mindshare and market position it lost focus, and this business is now fatally damaged, with others such as IBM, Oracle and Microsoft taking up the mantle.

More recently it acquired the APAMA complex event processing business, Actional (SOA management) and IONA (a dated integration business based in Ireland). It has since found some success with the excellent APAMA offering in the heartland of financial market data processing, but has struggled to replicate this success in other industries and use cases. Actional has also had some success, but it is immutably tied to the SOA star, which is having its own problems. And IONA, similarly to Progress, has a nice legacy integration business based around Orbix but has failed utterly over the years to create anything else worthwhile.

The result is that although the IONA purchase has increased revenues in the Progress ‘integration infrastructure’ business unit, this is likely to be a one-off improvement and once again Progress is going to be stuck with an aging cash-cow and no clear rising star to take over responsibility for driving growth.

This might seem a recipe for Progress itself to be acquired. Up to now, this has been unattractive due to the share price, but in the current climate an acquisition looks a lot more interesting. My view is that there are probably two strong candidate acquirers for Progress:

  • Companies looking for attractive maintenance businesses where profit can be maximized by cutting expenses and taking the money until the product line sunsets
  • Companies not currently in the integration space but wanting to get into this lucrative area and looking for a ready-made product set (perhaps to underpin a professional services business)

Who knows what will happen in the current turmoil? I may be way off the mark, but if I were a company fitting either of these two categories, and I had the money, I think now would be a good time to strike. After so many false dawns, I suspect the Progress management team might not resist too hard….


The data deluge: new surf-board required?

Another report (this time from Tower Group) is highlighting the likely increase in data volumes that enterprises will need to handle over the next couple of years – a 900% increase in financial market data by 2012.

Of course, increasing amounts of data flowing around the enterprise is hardly new (in the mid 1990s, I remember being told in hushed tones at one of the baby bells that their databases were several terabytes in size). Each time enterprises have been faced with such an increase, the same decisions have had to be made: is the focus on integration, storage or analysis, and can we use existing technologies or do we need (or want) new ones?

In the case of the Tower Group report, the focus is on financial services and in particular the increase in market data handling requirements caused by new regulations (the EU’s MiFID and the US’s Reg NMS – both intended to create fairer markets). Specifically, there is a regulatory requirement to store the data in a manner which can be used as evidence if there is ever a compliance issue. As these regulations relate to best-price execution, this in itself will be onerous, because every single published quote available when each trade is executed must be stored, as well as all other relevant data associated with the trade.
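To make that storage burden concrete, here is a toy sketch – all names and structures are hypothetical, not any vendor’s or regulator’s actual schema – of what capturing the prevailing quotes alongside each executed trade implies:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Quote:
    venue: str   # publishing venue for the quote
    bid: float
    ask: float
    ts: float    # timestamp the quote was published

@dataclass
class TradeRecord:
    trade_id: str
    price: float
    quantity: int
    ts: float
    # Every quote live at execution time, retained as compliance evidence.
    quote_snapshot: List[Quote] = field(default_factory=list)

def record_trade(trade_id, price, qty, ts, live_quotes):
    """Store the trade together with all quotes prevailing at execution."""
    return TradeRecord(trade_id, price, qty, ts,
                       quote_snapshot=list(live_quotes))

quotes = [Quote("VenueA", 10.01, 10.03, 99.9),
          Quote("VenueB", 10.00, 10.04, 99.8)]
rec = record_trade("T-1", 10.02, 500, 100.0, quotes)
# One trade row becomes 1 + len(quotes) stored rows, and the quote count
# scales with the number of venues -- which is where the volume explodes.
```

The point of the sketch is simply the multiplier: each trade drags its full quote context along with it, so storage grows with venues × quote rate, not just with trade count.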

Similar scenarios (where there is a flood of data, or a potential one) exist in many industries – from online games with millions of players generating massive amounts of data, to security monitoring devices for government agencies, to RFID tags. However, it is too easy to react by announcing that the data deluge is coming and some new technological surfboard is needed.

For a start, in many cases the data may have little potential value and storage may not even be needed. But of course this will not always be the case. To quote from the Bobsguide coverage of the press release:

“Regulatory compliance will pave the way for firms to completely automate the trading process,” added Price. “Once all the players have done so, the winner in the hunt for liquidity will be whoever can process the data the fastest.”

To put it another way, in the case of market data there may be business justification for doing more than storing the data for forensic reasons – either in the realm of integration or in analysis of the data streams. In my other examples there will also be opportunities, again in integration or analysis.

The second hurdle is whether new technology is needed.

Simply saying that there will be an increase in data and a need to do something with it is not enough to justify new technology, as old technologies may continue to scale. Going back to my baby bell experience, this was around the time when object databases were being touted as the relational database killer – on the grounds that only object databases could possibly handle databases of that size.

In the case of the “data deluge”, the real-time processing of streams (when the value of the analysis is high enough) is certainly one area where a new approach seems needed – and is an area targeted by the likes of StreamBase and Progress’ Apama products. For analysing the increasingly massive data warehouses, and for integration, I think the jury is still out – although the increasing complexity within the data may throw a spanner in the works.
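Products like Apama and StreamBase use their own dedicated event-processing languages, so the following is only a rough illustration of the underlying idea, not how either product actually works: a sliding-window spike detector that processes each event as it arrives, retaining only a bounded window rather than the full history – the property that lets stream processing keep up where store-then-query approaches cannot.

```python
from collections import deque

def detect_spikes(events, window=5, threshold=1.10):
    """Yield (index, price) whenever a price exceeds the trailing-window
    average by `threshold`. Each event is handled as it arrives and only
    the last `window` prices are kept in memory."""
    recent = deque(maxlen=window)  # bounded state: the sliding window
    for i, price in enumerate(events):
        if len(recent) == window:
            avg = sum(recent) / window
            if price > avg * threshold:
                yield (i, price)
        recent.append(price)

# A price stream with one obvious anomaly at index 5.
stream = [100, 101, 100, 99, 100, 130, 101, 100]
alerts = list(detect_spikes(stream))
# alerts == [(5, 130)]
```

The illustrative point is that memory use is fixed by the window size, not by the length of the stream – which is why this style of processing scales to market-data rates where storing everything first would not.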


Complex Event Processing (CEP) and SOA: Sitting side by side in finance

A perennial SOA discussion is whether it is a universal architecture or whether there will be complementary or even competing architectures.

This article in Water’s Online focuses on this issue in the context of where SOA is being used and, more importantly, where it is not. While the tone of the article is somewhat negative toward SOA, it is interesting to note how established SOA is in the large investment banks, with quotes featured from both HSBC and Bear Stearns.

To briefly cover the argument: SOA has obvious overheads if messages are encoded in XML and used synchronously. This makes it unusable in some of the trading and market data applications where the volumes of data are huge and speed is everything. In this space, Complex Event Processing (CEP) tools such as those from Progress Apama and StreamBase are clearly more suitable than SOA approaches based on business processes and synchronous processing models. To quote a well-balanced view from a senior IT person at Bear Stearns:

“The more glamorous parts of the trading system typically involve super high-performance networks, data-distribution models, and technologies such as CEP that are ‘in line’ with information coming from and going to the markets,” says Moschetti [chief architecture officer at Bear Stearns]. “But other aspects of the system, such as entitlements, configuration, loading of baskets, lookups on reference data, and so on, can use what the industry conceives as SOA—that is, Web services, XML over message busses, and so on,” he adds.

Therefore, it isn’t a matter of one approach dominating the other in some strange Darwinian struggle. For industries where the processing requirements include real-time streams (and in particular where the stream volume and complexity exceed what ‘static’ approaches based on variations on data warehousing can handle), CEP will have a bright future. At the moment that means: the front office in financial services, some government security applications and Massively Multiplayer Online Gaming (as shown by StreamBase’s recent announcement). Apart from that, CEP has an important role to play in control and monitoring within the SOA layer, as Steve has previously blogged about.