Progress Software acquires Savvion

So Progress Software has bought yet another software company; this time a BPM vendor, Savvion. But is this the right move for Progress?

Progress Software has spent most of its life growing through acquisition, making use of the piles of cash generated by its legacy mid-range database product to find new areas of growth. After all, the legacy business may be highly profitable, but its returns are dwindling by the year and Progress desperately needs something else to shore up its balance sheet. Unfortunately, its acquisitions have had a patchy record of success. Perhaps it will be different this time.

Savvion is a credible BPM (Business Process Management) software provider, and 2009 was a bumper year for BPM sales. Specialist companies like Pegasystems and Lombardi showed huge growth rates, bucking the downward trend triggered across many technology sectors by the economic upheaval. On top of this, Progress has been trying to establish itself as a viable SOA (Service Oriented Architecture) and business integration vendor ever since it launched the Sonic ESB in the early years of the last decade, and BPM was a glaring hole in its portfolio. For these reasons, it is easy to see why Savvion would seem a good fit.

There seem to be two problems for Progress, however. Firstly, BPM is now rarely a solution bought in its own right – hence the rapid consolidation of the BPM market, with Pegasystems more or less the only major pure-play BPM vendor left standing following IBM’s acquisition of Lombardi. Instead, BPM is increasingly deployed as part of a business transformation strategy involving components such as SOA, application and data integration, business rules, business monitoring and business events management. Secondly, the gorillas in the space are now IBM, Oracle and SAP. These companies all offer a full suite of products and, more importantly, services based around BPM and the rest of the modern infrastructure stack. Companies such as Software AG, TIBCO and Axway form a credible second tier, too.

In previous acquisitions, Progress has treated each acquired company purely as a set of software products. This is not surprising, since selling databases is more about selling products than selling solutions, but it is this factor that has been at the root of the patchy performance of Progress acquisitions. For instance, the DataDirect division of Progress, where it placed a number of acquisitions in the data space, has fared reasonably well, because it is essentially a product business. However, its attempts in areas such as ESBs and SOA governance have suffered from a seeming reluctance to embrace a more industry-specific, services-based solution model.

With its acquisition of Savvion, Progress once again has the chance to show the market that it has learnt from its mistakes. BPM is absolutely an area where companies need to be offered solutions – products together with services and guidance to develop effective and affordable business solutions. It will be hard enough for Progress to carve out a share of the BPM pie with all the big players involved, but it does have one outstanding advantage: a strong and accessible customer base in the mid-range market, where the larger companies struggle. However, if it fails to take on board the need to hire industry-vertical skills and solution-based field and service professionals, then this acquisition could prove to be yet another lost opportunity.

Steve

BAM vs BI

Lustratus recently received a comment on a post I wrote a couple of years back about IBM’s acquisition of Cognos.

The comment asked whether this meant IBM now had two BAM tools, Cognos and WebSphere Business Monitor. Rather than respond to the original and now very old post, I thought I would create a new one, since this question crystallizes a very contemporary confusion over the roles of BAM and BI.

BI (Business Intelligence) is the term that originally emerged to describe the market for tools to find out more about data. Typically, BI tools aggregated data and provided ‘slice and dice’ services to look at it in different ways, correlating it and detecting interesting patterns. As a simple example, examining sales information allowed Amazon to identify trends in related customer buying – hence when you buy a DVD, Amazon can helpfully pop up and point out that ‘people who bought this DVD also bought….’ to try to accelerate sales based on buying patterns. The key characteristic of BI was that it was typically a static activity, usually carried out against historical data, although these days it is more and more applied to the analysis of any data, whether static or dynamic. Cognos was a leading supplier of BI tools.
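To make that example concrete, here is a minimal sketch of the kind of historical, batch-style aggregation BI tools perform. The order data and field names are invented for illustration – this is not any particular product’s API:

```python
from collections import defaultdict
from itertools import permutations

# Historical order data - each order is the set of items one customer bought.
# In a real BI tool this would come from a data warehouse, not a literal.
orders = [
    {"dvd_godfather", "dvd_goodfellas"},
    {"dvd_godfather", "dvd_goodfellas", "book_scorsese"},
    {"dvd_godfather", "book_puzo"},
]

# Count how often each ordered pair of items appears together in one order.
co_bought = defaultdict(int)
for order in orders:
    for a, b in permutations(order, 2):
        co_bought[(a, b)] += 1

def also_bought(item, top_n=3):
    """Items most often bought alongside `item`, best first."""
    related = [(b, n) for (a, b), n in co_bought.items() if a == item]
    return sorted(related, key=lambda x: -x[1])[:top_n]

print(also_bought("dvd_godfather"))
# e.g. [('dvd_goodfellas', 2), ('book_scorsese', 1), ('book_puzo', 1)]
```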

BAM (Business Activity Monitoring) was the term coined to describe tools that were primarily focused on analysing the behaviour of run-time applications rather than static data. An example here might be monitoring a loans application to see how often loan requests have to be queued up for supervisor approval rather than executed under the authority of the loans advisor. A trigger could then be defined to highlight excessive supervisor involvement, which might indicate other problems such as inadequately trained loans advisors.
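Purely as an illustrative sketch (the event fields, window size and threshold are invented rather than taken from any real BAM product), the loans example boils down to watching a live event stream and raising an alert when the supervisor-referral rate drifts too high:

```python
from collections import deque

WINDOW = 100       # examine the last 100 loan requests
THRESHOLD = 0.25   # alert if more than 25% need supervisor approval

recent = deque(maxlen=WINDOW)

def alert(message):
    print("BAM ALERT:", message)  # in practice: dashboard, email, ticket...

def on_loan_event(event):
    """Called for each loan request as it happens - BAM-style monitoring
    of the running application, rather than analysis of historical data."""
    recent.append(event["needs_supervisor"])
    if len(recent) < WINDOW:
        return  # not enough data yet to judge the trend
    rate = sum(recent) / len(recent)
    if rate > THRESHOLD:
        alert(f"{rate:.0%} of recent loans referred to supervisor - "
              "possible training gap among loans advisors")

# In a real deployment this handler would be wired to the message flow
# of the live loans application, e.g. on_loan_event({"needs_supervisor": True})
```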

So, to reinforce this distinction, an executive view of BAM might be a dashboard display that shows the operational performance of the business in real time, with a colour-coded scheme to point out areas of concern. In contrast, the BI view might be a set of charts in a presentation describing business or buyer trends based on analysis of company performance over the last three months, enabling new initiatives to support competitiveness or open up new business opportunities.

Over time, these two markets have tended to overlap. After all, both involve gathering information and then analysing it. While the gathering will be completely different in the two cases (reading data files versus monitoring business application and process execution), the analysis may well involve the same procedures, and hence BI analysis technology may well be used to enhance BAM offerings. However, there is a more pressing reason for linking the two areas. More and more companies are looking to ‘BAM’ as a way to optimize and enhance operational execution, and it is foolish to limit its scope to application performance alone. The user really wants to take into account all information in order to reflect corporate performance and identify opportunities. This covers both real-time execution AND what is happening in the data files.

However, because of the different smarts required for these two areas, it is unlikely that the products will merge – in other words, IBM is unlikely to replace Cognos and WebSphere Business Monitor with a single product. That would make little sense. Instead, companies are likely to improve the linkage and integration between these two distinct areas of technology.

Steve

IBM gets Cognos to fill the gaps

IBM has been on quite an acquisition spree of late, but the latest is perhaps the most predictable and potentially powerful.

IBM is buying Cognos, the business intelligence and performance management vendor, for $5B. The move has been on the cards for some time, and became all the more likely when SAP acquired Business Objects recently – Cognos and Business Objects were acknowledged as the two leading players in the BI space.

This looks to be a great deal for both companies, and users should be pleased. With 25,000 customers spread across some of the largest companies in the world, Cognos is a well-established BI player, and shares many customers with IBM. On top of this, following the SAP acquisition of Business Objects, Cognos was the last big pure-play BI vendor and a prime acquisition target, so its users must have had concerns over potential new masters. However, they are likely to be very comfortable with IBM, since it does not really have competing products and will therefore keep the technology alive and moving forward. From an IBM point of view, it will be particularly keen to get its hands on Cognos’s 1,000+ BI-experienced R&D team, as well as its 2,000 or so customer-facing field force.

The key here is that the two companies mesh really well together. IBM’s Information on Demand initiatives have made every effort to ensure that data is accessible wherever and whenever it is needed, while Cognos concentrates on interpreting the data and generating business value from it. Combining the two promises to deliver even more accurate, timely and understandable information to support executive decision-making and business-driven data mining initiatives.

This also fills some gaps for IBM in its service-oriented architecture (SOA) strategy. At the moment, one of the few weak spots for IBM is its lack of an industry-leading BAM (Business Activity Monitoring) offering. That is, IBM’s SOA can make programs visible in terms of business services, improving the chances of getting a clearer understanding of business activity in IT operations, but it has not been particularly good at interpreting that information in BAM terms. Now it will be able to deliver one of the best BAM solutions in the marketplace, making it possible not just to streamline and automate processes but also to continually improve their effectiveness.

If there is a challenge for IBM in absorbing Cognos, it is in the area of organization. IBM has decided to keep Cognos as its own business unit – BI and performance management – within the IBM Information Management division. While this makes sense at one level, in that BI is closely related to information, the performance management side is closely linked to the IBM SOA initiative, which is driven from another division. IBM will need to manage this carefully, but it is not the first time it has faced this sort of cross-divisional issue – it dealt with much the same thing when it acquired Tivoli, the management software company.

All in all, this acquisition looks to be good for just about everyone – IBM users, Cognos users, and both companies and their employees….but perhaps not for the competition!

Steve

Is your legacy integration just a veneer?

Summer seems to be a time for reading through that huge pile of interesting articles and magazines that you can only find time to look at on vacation.

On flicking through my own mountain of stuff, I came across the Q2 edition of Financial-i, a magazine targeted at the Financial Services industry. One point that jumped out at me was a comment from Paul Joynt of Nordea, a Scandinavian finance house. Paul was pointing out that SOA does not necessarily solve the problems associated with legacy integration.

The article, ‘SOA – is it worth the effort’, is available from the Financial-i site if you register, but Paul comments that covering a legacy system with a wrapper “so it looks like what you want” still leaves problems with the next level of change, because “it’s only a veneer”.

I think Paul has hit on an important point here. Different vendors in the SOA space have different approaches to the problem of integrating legacy systems. Some will simply ‘hand off’ the request for legacy information to a tool from the legacy supplier – in the case of IBM mainframes this might mean using WebSphere MQ as the bridge, for instance. Others might approach the problem with some sort of screen-scraping or other interface-simulation technique, where the legacy application is fooled into thinking it is running in its normal mode of operation. Yet others may generate code-based wrappers for each individual need, to be executed whenever a particular service is required.

To me, this all sounds too much like veneer in Paul’s terms. Although it might address immediate needs, future changes will continue to demand substantial additional work and the creation of more and more ‘special-case’ code and wrappers.
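To make the point concrete, here is a purely hypothetical sketch of such a veneer – the screen names and fields are invented, and the legacy session is faked for illustration. One hard-wired wrapper serves one business need, with the legacy system itself none the wiser:

```python
class FakeLegacyScreen:
    """Stand-in for a screen-scraping session - purely illustrative."""
    def __init__(self, screen_id):
        self.screen_id, self.fields = screen_id, {}
    def fill(self, field, value):
        self.fields[field] = value
    def submit(self):
        # Pretend the legacy transaction ran and populated output fields.
        self.fields["CURR-BAL"] = "1234.56"
    def read(self, field):
        return self.fields[field]

def navigate(screen_id):
    return FakeLegacyScreen(screen_id)

# The 'veneer': a wrapper hard-wired to one canned legacy interaction.
# The legacy application knows nothing about services, so the next
# requirement means writing yet another wrapper just like this one.
def get_account_balance(account_id: str) -> float:
    screen = navigate("ACCT010")        # drive the legacy screen
    screen.fill("ACCT-NO", account_id)  # fake the operator's keystrokes
    screen.submit()
    return float(screen.read("CURR-BAL"))

print(get_account_balance("000123"))  # 1234.56
```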

Instead, a best-of-breed legacy integration solution should embrace SOA and integration rather than try to fool it with wrappers designed to seal off the legacy world from the outside. Legacy integration should be about making the legacy system a full and active participant in service definition and execution. For example, orchestration should be possible both outside and within the legacy environment, and services should be built with full participation from both sides. By taking this approach, a best-of-breed legacy integration tool will ensure that future changes become easier, quicker, cheaper and more reliable.

For more information on the whole subject of legacy integration, specifically in the case of mainframe systems, Lustratus offers a free paper on the subject.

Steve

Software AG and webMethods – part II

Previously I blogged about the Software AG acquisition of webMethods and what it might mean.

Lustratus also produced a paper on the subject, here. I thought it was time for an update, now that things are becoming clearer.

I congratulate Software AG on listening to my comments! Well, at least partially….the company has brought together pieces from both the webMethods and Software AG sides in the area of SOA, and has come up with a suite offering an ESB (well, actually more than one), adapters, BPM, BAM, and legacy and user interface integration support, called – wait for it – the webMethods SOA Suite!

The company has wisely decided to leverage the strength of the webMethods brand name, both in the integration/SOA space and geographically in the US. My only quibble is that I have in fact been slightly misleading: I believe the full name is ‘Software AG webMethods SOA Suite’. I just hope keeping the Software AG name so prominent does not backfire.

It seems that the suite will be made up of webMethods BAM and BPM, together with a combination of Software AG and webMethods integration infrastructure products. For example, Software AG’s CentraSite deals with registry/repository needs, and use is also made of Software AG’s connectivity strength – webMethods EntireX, for instance, deals with turning legacy code into SOA services. The ESBs are a bit confusing – there appear to be three. One is a regular ESB, one is an ‘ESB+’ (basically the webMethods integration backbone) and one is a lightweight integration tool aimed at partner network needs.

So how is the merger going down? Well, it seems that at least some webMethods customers are glad to see the combination. Apparently this is because webMethods was really a mature start-up – innovative, but not necessarily strong in internal development, delivery and maintenance processes – whereas Software AG has a reputation for being a solid, experienced and mature software brand. So presumably webMethods customers hope Software AG will bring some additional discipline to product delivery and support.

Anyway, the proof of the pudding will be whether the combination gains traction. I think it has a chance, although if only they had dropped the Software AG brand from the suite’s name altogether….

Steve

EDA vs SOA

I have been involved in some recent research into event-driven architecture (EDA) and its relationship to service-oriented architecture (SOA), as a result of confusion abounding over the two concepts.

Some people seem to think EDA = SOA 2.0. Others believe they are already doing EDA in their SOA implementations because they are using asynchronous communications such as JMS or IBM WebSphere MQ. This confusion is exacerbated by vendors with their own agendas – TIBCO has been banging the EDA drum for ages as the preferred way to solve integration problems, IBM has just held a massive event to drive its own SOA agenda, Oracle seems to be trying to straddle the two approaches, and complex event processing (CEP) vendors like Progress have their own stories about EDA.

My own analysis, carried out with Dr. Ronan Bradley, also of Lustratus, concluded that, as is so often the case, the problem comes down to confusion over terminology. EDA is an architecture, just like SOA. It is a way of running operations and, before anyone asks whether I am on the side of SOA or EDA, the two can happily coexist. The confusion arises when people start to use EDA as a term for particular implementations rather than for the architecture itself.

In fact, we identified three major ways in which EDA relates to SOA, and concluded that EDA may have a key role to play as SOA matures – dealing with the increasing management complexity of wide-scale SOA deployments through a ‘management by exception’ approach.
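To give a feel for what ‘management by exception’ might look like, here is a minimal sketch under invented assumptions – the event shape and thresholds are mine, not drawn from the research itself. Services publish events as they run, and a monitor stays silent unless something falls outside normal bounds:

```python
import queue

# A toy event bus: SOA services publish events, EDA-style consumers
# subscribe. A single in-process queue stands in for real middleware.
bus = queue.Queue()

def service_completed(name, elapsed_ms, ok=True):
    """Called after each service invocation - publish an event and move on."""
    bus.put({"service": name, "elapsed_ms": elapsed_ms, "ok": ok})

def exception_monitor(max_latency_ms=500):
    """Management by exception: ignore routine events, react to outliers."""
    while not bus.empty():
        event = bus.get()
        if not event["ok"] or event["elapsed_ms"] > max_latency_ms:
            print("ATTENTION:", event)  # page an operator, open a ticket...
        # Routine events are deliberately ignored - no news is good news.

service_completed("creditCheck", 120)
service_completed("creditCheck", 950)           # slow - flagged
service_completed("fraudScreen", 80, ok=False)  # failed - flagged
exception_monitor()
```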

For those interested in reading the detailed research, Lustratus has published an Insight on the subject, available at the Lustratus site.

Steve

Tibco well-placed for the future of SOA with Spotfire acquisition

Tibco has announced that it will spend $195m to buy Spotfire, a business intelligence company.

This represents a major investment in upgrading its ability to meet a key requirement that arises in large-scale, mature SOA deployments: the ability to monitor and control the increasing quantity of business-relevant information that flows through the network as SOA becomes more pervasive within an enterprise.

In the pre-SOA world, most business processes were completed within a single application, so in order to get a complete picture of the business processes executing it was sufficient to monitor the applications. In an SOA deployment, this is no longer the case: many business processes are split across applications, and much of the information resides in the SOA network rather than in the applications themselves. Therefore, to re-establish control from both a business and an operations management perspective, it is essential to track the message and process flows through the network.

This acquisition gives Tibco a major boost to its Business Activity Monitoring capability (it already had BusinessFactor, which presumably will now be retired), and that is at least half the solution: allowing business and IT operations managers to display and analyse the information. The other half is identifying the information to deliver, and this requires what is called Complex Event Processing (CEP): the ability to identify the unusual events, or combinations of events, that are of interest.

CEP is needed because, while there is a potentially huge quantity of information flowing around the network (corresponding to each service invocation and response), from a monitoring and control point of view most of it is of little interest, since it relates to the normal, routine operation of business processes. The focus is therefore not the bulk shipping of all process and message information to a data warehouse, but rather the intelligent identification and management of anomalous behaviour (perhaps an order so large that it requires special approval, or an inventory problem that is stopping order processes from completing).
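As a rough sketch of that filtering idea (the event fields and rules are invented – real CEP engines use far richer rule languages than this), a CEP layer watches the whole stream but only surfaces the anomalies:

```python
# Toy CEP-style filter: the vast majority of events describe routine
# business activity and are dropped; only anomalies are passed on.

APPROVAL_LIMIT = 50_000  # invented business rule: big orders need sign-off

events = [
    {"type": "order", "id": 1, "value": 250},
    {"type": "order", "id": 2, "value": 72_000},       # needs approval
    {"type": "inventory", "sku": "X9", "on_hand": 0},  # blocks orders
    {"type": "order", "id": 3, "value": 80},
]

def is_anomalous(event):
    if event["type"] == "order" and event["value"] > APPROVAL_LIMIT:
        return True
    if event["type"] == "inventory" and event["on_hand"] == 0:
        return True
    return False  # routine traffic - not worth an operator's attention

alerts = [e for e in events if is_anomalous(e)]
print(alerts)  # only the two exceptional events survive the filter
```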

Tibco is also well placed to provide CEP, having announced the second version of its CEP product, BusinessEvents, just last week – and I expect to see announcements about how the company intends to plug Spotfire and BusinessEvents together.

Ronan