Process and Criteria for Evaluating Services-Based Integration Technologies
Anna Liu and Ian Gorton
Integration Challenges
July 2005
Applies to:
Enterprise architecture
Enterprise application integration technologies
Summary: This article describes a proven approach to assist architects with evaluating enterprise application integration technologies. In particular, we focus our discussion on evaluating integration technologies for implementing services-based integration. (13 printed pages)
Contents
Introduction
SOA for Integration
Technologies for Implementing Services-Based Integration
Evaluating Integration Technologies
The i-MATE Process
The Process
The Knowledge Base
Trade-off Analysis
Conclusion
References
Introduction
Building an enterprise application integration solution is difficult. These solutions need to integrate multiple business systems that were not intended to work together. Integrating such systems is hard for many reasons. These include the heterogeneity of the platforms and programming languages, the diversity and complexity of each individual business system, and the difficulty of understanding the requirements for the resulting integrated solution. Software architects undertake a number of crucial tasks during the design of integrated enterprise applications. Among these are:
- Helping to understand the functional and quality requirements for the integrated applications.
- Creating the initial architectural blueprint for the integrated applications.
- Selecting suitable integration technologies that can fulfill the application requirements.
- Validating that the combination of the architecture and the integration technology used to build the enterprise-wide application are likely to be successful before a major implementation investment is made.
This article describes a proven approach to assist architects with evaluating enterprise application integration technologies. In particular, we focus our discussion on evaluating integration technologies for implementing services-based integration.
SOA for Integration
With the advent of industry standards such as Web Services, service-oriented architecture (SOA) is driving a paradigm shift in many areas, including enterprise application integration.
The services-based approach to integration connects computing entities through service interactions. It addresses the problems of integrating legacy and inflexible heterogeneous systems by enabling IT organizations to offer the functionality locked in existing applications as reusable services.
In contrast to traditional enterprise application integration (EAI), the significant characteristics of the services-based approach to integration are:
- Well-defined, standardized interfaces—Consumers are provided with easily understood and consistent access to the underlying service.
- Opaqueness—The technology and location of the application providing the functionality are hidden behind the service interface. In fact, there is no need for a fixed service provider.
- Flexibility—Both the providers of services and consumers of services can change—the service description is the only constant. As long as both the provider and consumer continue to adhere to the service description, the applications will continue to work.
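To make the flexibility point concrete, the following minimal Python sketch (all class, method, and data names are hypothetical, invented here for illustration) treats the service description as the only contract: either provider can be substituted without any change to the consumer.

```python
from typing import Protocol


class CustomerLookupService(Protocol):
    """The service description: the single contract both sides depend on."""

    def get_customer(self, customer_id: str) -> dict:
        ...


class MainframeCustomerAdapter:
    """One provider: wraps a legacy application behind the contract."""

    def get_customer(self, customer_id: str) -> dict:
        # In a real solution this would call the legacy system via an adapter.
        return {"id": customer_id, "source": "mainframe"}


class CrmCustomerAdapter:
    """A replacement provider: different technology, same contract."""

    def get_customer(self, customer_id: str) -> dict:
        return {"id": customer_id, "source": "crm"}


def print_customer(service: CustomerLookupService, customer_id: str) -> None:
    """A consumer that depends only on the service description."""
    print(service.get_customer(customer_id))


# Either provider can be swapped in without the consumer changing.
print_customer(MainframeCustomerAdapter(), "42")
print_customer(CrmCustomerAdapter(), "42")
```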
Technologies for Implementing Services-Based Integration
Technologies for building services-based integration need to provide the following basic functionalities:
- Message delivery
- Intelligent routing
- Event services
- Application adaptors
- XML translation/data transformation
- Rules processing
- Web Services support
- Service/Process orchestration
- Business Process Management
- Business Activity Monitoring
Further, to ensure the success of services-based integration, the integration technology needs to have the following qualities:
- Scalability
- High performance
- Security
- Manageability
As these lists show, a major focus is the use of industry standards such as Web Services for the actual message delivery and the various other advanced services, avoiding a key problem of traditional EAI technologies: the use of proprietary protocols for message exchange. In this way, services-based integration is a design pattern that promotes interoperability and true integration in a heterogeneous enterprise landscape.
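As a concrete illustration of two of the capabilities listed above, intelligent routing and XML data transformation, the following minimal Python sketch (the message schema, element names, and service names are assumptions made for illustration, not drawn from any particular product) shows content-based routing and reshaping of a standards-based XML message.

```python
import xml.etree.ElementTree as ET

# A hypothetical incoming message; the schema is invented for illustration.
incoming = """
<order>
  <customerType>gold</customerType>
  <amount currency="USD">1200.00</amount>
</order>
"""


def transform(order_xml: str) -> ET.Element:
    """Data transformation: reshape the message into a target schema."""
    source = ET.fromstring(order_xml)
    target = ET.Element("PurchaseRequest")
    ET.SubElement(target, "Tier").text = source.findtext("customerType")
    ET.SubElement(target, "Total").text = source.findtext("amount")
    return target


def route(order_xml: str) -> str:
    """Content-based routing: choose a destination from message content."""
    source = ET.fromstring(order_xml)
    if source.findtext("customerType") == "gold":
        return "PriorityFulfilmentService"   # hypothetical service names
    return "StandardFulfilmentService"


print(route(incoming))                                       # PriorityFulfilmentService
print(ET.tostring(transform(incoming), encoding="unicode"))  # the transformed message
```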
Evaluating Integration Technologies
There are many and varied implementations of integration technologies that provide the functionalities listed above. They range from traditional EAI technologies with Web Services features added on, to brand-new implementations with inherent Web Services support.
Unfortunately, selecting an appropriate integration technology implementation is not a simple proposition for most IT organizations. There are numerous reasons for this, but they typically revolve around:
Technology complexity—Integration products are large, diverse, and literally have thousands of features and application programming interfaces. They are complex to understand and low-level details can have serious effects on the way a product behaves. The devil really is in the detail.
Product differentiation—There are tens to hundreds of products competing in the integration arena. At a superficial level, many have almost identical sets of features and capabilities. Price differences can often be very large, which further compounds the problems of selection and acquisition.
IT organization knowledge—End user organizations rarely have architects and engineers who have the necessary deep and broad understanding of all integration technologies and products. It is therefore a time-consuming, expensive exercise for the organization to acquire this knowledge in order to choose an appropriate integration product. It also distracts key engineering staff from their mainstream, application-focused tasks.
The i-MATE Process
i-MATE (Middleware Architecture and Technology Evaluation in Internet time) is a specialized software engineering process for evaluating commercial off-the-shelf (COTS) middleware. It is suitable for organizations operating at Level 3 in the Software Engineering Institute's Software Acquisition Capability Maturity Model [1], especially in its support for the User Requirements and Acquisition Risk Management key process areas. The effectiveness and novelty of i-MATE lies in the combination of:
A defined process—This comprises a straightforward series of well-defined process steps for gathering, ranking, and weighting application requirements for integration middleware.
A knowledge base—This contains several hundred generic requirements for various classes of COTS middleware products, including those unique to services-based integration implementations.
A requirements analysis tool—The analysis tool enables rapid assessment, experimentation, and presentation of how the middleware products under evaluation compare against the project requirements.
The following sections describe i-MATE's unique features, which make it highly suitable for evaluating integration technologies in the context of building a services-based integration solution.
The Process
The process used in i-MATE is similar to those described in [2,3]. It defines the series of steps that are undertaken in i-MATE, and the decisions made and artifacts produced at each stage. These are depicted in figure 1 and are briefly explained here:
Elaborate customer requirements—This first step produces a document that captures the customer's requirements. As the technology and application problems are complex, we usually find that the overall requirements are not fully understood. Consequently, a number of workshops are held with the application stakeholders to elicit the requirements. The stakeholders involved ideally include both IT and business groups. The resulting document details the business and technical requirements that are specific to the need for integration technology in this services-based integration environment. Each requirement is expressed as a single item that can be evaluated against a specific integration technology.
Augment with generic requirements—This step introduces the i-MATE knowledge base of over 200 generic, broadly applicable requirements for integration technologies. These augment the set of application-specific requirements with generic integration requirements. The output of this step is the overall set of application requirements for the services-based integration technology, expressed as individually identified requirement points.
Rank overall requirements—Working with the key application stakeholders, the overall set of requirements is ranked. At a coarse level, each requirement is deemed mandatory, desirable, or low priority/not applicable. Within each of these categories, importance weightings are assigned to give fine-grained control over requirement rankings, in a fashion similar to [4,5]. The output of this step is a collection of weighted requirements stored in the i-MATE requirements analysis tool.
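One way to picture the output of this step is as a set of weighted requirement records. The sketch below is a hypothetical Python rendering; the identifiers, priority multipliers, and weights are illustrative assumptions, not the actual i-MATE scheme.

```python
from dataclasses import dataclass

# Coarse priority bands as described above; the multiplier values are an
# illustrative assumption, not figures prescribed by i-MATE.
PRIORITY_MULTIPLIER = {"mandatory": 3.0, "desirable": 2.0, "low/na": 0.0}


@dataclass
class Requirement:
    identifier: str   # e.g. "WS-01" (hypothetical numbering)
    category: str     # high-level category, e.g. "Web Services Support"
    priority: str     # "mandatory", "desirable", or "low/na"
    weight: float     # fine-grained importance within the band


def effective_weight(req: Requirement) -> float:
    """Combine the coarse priority band with the fine-grained weighting."""
    return PRIORITY_MULTIPLIER[req.priority] * req.weight


requirements = [
    Requirement("WS-01", "Web Services Support", "mandatory", 1.0),
    Requirement("WS-02", "Web Services Support", "desirable", 0.6),
    Requirement("SM-01", "System Management", "desirable", 0.8),
]
```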
Identify candidate products—This step identifies the three to five integration products that are most likely to be applicable to the overall application requirements. In some cases, the customer has already identified a shortlist, based on both technical and business reasons. In others, we use our experience to work with the customer to identify the most likely candidates.
Figure 1. Evaluation Process
Product Evaluation—In workshops with the key customer stakeholders and product vendor representatives, we evaluate each of the candidate products against the overall requirements. Scores are allocated against each requirement point for each product and captured in the requirements analysis tool. This involves in-depth technical discussions, and stepping through relevant application scenarios to understand precisely how the integration products actually behave. In some cases, product capabilities and features can cause the process to iterate and refine the requirement rankings. Once all products have been evaluated, the requirements analysis tool automatically calculates weighted summary scores based on individual requirement point scores and weightings. Summary charts are also automatically created to support efficient presentation and reporting.
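The weighted summary calculation that the tool automates amounts to a weighted average of each product's per-requirement scores. A minimal sketch follows, with hypothetical requirement identifiers, weights, and scores.

```python
# Hypothetical requirement weights and per-product scores against each
# requirement point; all identifiers and values are illustrative only.
weights = {"WS-01": 3.0, "WS-02": 1.2, "SM-01": 1.6}

scores = {
    "Product A": {"WS-01": 4, "WS-02": 3, "SM-01": 5},
    "Product B": {"WS-01": 5, "WS-02": 2, "SM-01": 3},
}


def weighted_summary(product_scores: dict) -> float:
    """Weighted average of one product's scores across all requirements."""
    total_weight = sum(weights.values())
    weighted = sum(weights[r] * s for r, s in product_scores.items())
    return weighted / total_weight


for product, product_scores in scores.items():
    print(f"{product}: {weighted_summary(product_scores):.2f}")
```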
Scenario Analysis—By varying requirement weightings, the requirements analysis tool makes it trivial to explore various "what-if" scenarios and trade-offs. These can be used to further differentiate between candidate products, or confirm the appropriateness of a certain product under varying requirements. The output from this step is the recommendation of one or more products that can satisfy the application requirements.
Figure 2. Technology Considerations and Proof-of-Technology Prototype
Proof-of-Technology Prototype—When the outcome from the product evaluation is not 100 percent conclusive, a rapid proof-of-technology prototype is developed. The prototype typically implements one critical scenario that will exercise and/or stress the requirement(s) considered to have highest priority. Even very simple prototypes are powerful tools that give concrete, indisputable evidence of product capabilities. In several i-MATE projects, the results of prototypes have provided the final differentiation required to finalize product selection. In fact, a prototyping phase is always recommended in i-MATE, even if one product emerges from the process as a clear leader. However, when only one product is considered, the prototyping task is not competitive and can be scoped and structured more towards validating key application requirements.
In terms of resources, the Product Evaluation and Proof-of-Technology Prototype stages invariably consume most of the effort in i-MATE. Product evaluation takes, on average, one to three days of effort per product under evaluation, depending on the familiarity of the i-MATE team with the particular product. Prototyping is more variable, and depends on the complexity of the desired prototype. In most cases a simple system suffices, and the prototyping stage lasts less than one week. In other applications in which risks are higher, prototyping has extended to around one month.
The Knowledge Base
The i-MATE knowledge base contains an extensive set of generic requirements for the broader classes of middleware technologies, as well as requirements specific to technologies that support service orientation. These generic requirements are derived from practical experience in CSIRO's Middleware Technology Evaluation (MTE) project [6], from working with product vendors, and from consulting engagements with clients, such as in [7].
As there are different classes of middleware products, there is a different instantiation of the overall knowledge base for each class of product. For example, the knowledge base is versioned for services-based integration technologies, enterprise application integration (EAI) technologies, application server technologies, and CORBA technologies. We will look at the services-based integration knowledge base in the following example.
A detailed analysis of the set of generic requirements has resulted in each knowledge base being structured as a set of high-level categories, each of which encapsulates several individual requirement items. Presentation of the whole knowledge base is beyond the scope of this paper, but as an example, the high-level evaluation categories in the services-based integration technology version include Web Services Support, Rules Engine, Development and Support, and System Management.
Each high-level category typically contains between 10 and 20 individual requirements that relate to that category. For example, the Web Services Support category contains the individual technology requirements shown in figure 3.
These requirement points cover low-level, detailed features of the integration technologies. All services-based integration solutions will inevitably require some or all of these capabilities. During an i-MATE project, the client is led through the contents of the knowledge base, and the importance of each requirement to the client application is determined. In some projects, the client is technologically sophisticated, and the process is fast and straightforward, taking less than a day. In other projects, the client relies on the i-MATE team to explain the implications of many of the requirements, and their relative importance is set collaboratively.
In addition to the categorized requirements, the i-MATE knowledge base is populated with evaluations of various versions of major middleware products. Each product in the knowledge base is ranked on a scale of 1-5 against the individual requirements. The rankings are produced and kept current through two mechanisms, as explained below and depicted in figure 4.
The first is the MTE project, which rigorously evaluates middleware technologies using a defined, repeatable approach [6]. The outputs of the MTE evaluations feed directly into the evaluations in the i-MATE knowledge base. The second mechanism is i-MATE projects themselves. Clients often request that an integration technology, or another middleware product or version, that has not previously been evaluated be assessed during an acquisition project. In such circumstances, the i-MATE team works with the product vendor to rank the product features. The resulting evaluations extend the coverage of the products in the knowledge base and can be reused in subsequent projects.
By reusing the generic requirements in i-MATE, organizations are saved the cost of developing their own set of integration technology requirements. Effort can thus be focused on capturing their application-specific requirements, and on planning and designing the enterprise-wide service-oriented architecture. This saves both time and effort, and helps produce a low-risk outcome.
Figure 3. Web Services Support
Trade-off Analysis
A custom requirements analysis tool has been built to support trade-off analysis as part of the i-MATE process. The basic tool functionality provides the following:
- Capture of individual requirement points, both generic and application-specific, structured into high-level categories.
- Capture of product rankings and requirement weightings.
- Instantaneous calculation of weighted averages for requirement categories.
- Instantaneous calculation and reporting of the evaluation outcomes using charts and graphs.
A screenshot of the trade-off analysis tool is shown in figure 5. It is based on a spreadsheet program. The major strength of this approach is demonstrated during the Product Evaluation and Scenario Analysis phases of i-MATE. As the spreadsheet is "live," any changes made to category rankings or requirement item weightings are immediately reflected in the graphs depicting the evaluation scores.
Figure 4. Populating the Knowledge Base
For example, in figure 6 the screen for setting requirement category weightings is shown. In this project, the Rules Engine, Development and Support, and System Management categories are deemed highest priority. These settings generate a set of graphs representing product rankings once the evaluation of the products is complete. At this stage, it is usually desirable to explore how the overall evaluation result may vary if one of these high-priority categories is reduced to a medium level of priority. Changing any of the priority values causes the spreadsheet to instantly reflect these changed priorities in the evaluation results. This makes it feasible to rapidly explore alternatives and confirm the evaluation results under various alternative scenarios.
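The effect of such a what-if change can be pictured as a simple recomputation. In the sketch below, the category names follow this example, but all scores and weighting values are hypothetical; it shows how dropping one category from high to medium priority can change the overall comparison.

```python
# Category-level scores already rolled up from individual requirements.
# Category names follow this example; all numbers are hypothetical.
category_scores = {
    "Product A": {"Rules Engine": 4.2, "Development and Support": 3.8,
                  "System Management": 4.5},
    "Product B": {"Rules Engine": 4.8, "Development and Support": 3.1,
                  "System Management": 3.6},
}

HIGH, MEDIUM = 3.0, 2.0


def overall(weightings: dict) -> dict:
    """Recompute each product's overall score under the given weightings."""
    total = sum(weightings.values())
    return {
        product: round(sum(weightings[c] * s for c, s in per_cat.items()) / total, 2)
        for product, per_cat in category_scores.items()
    }


baseline = {"Rules Engine": HIGH, "Development and Support": HIGH,
            "System Management": HIGH}
what_if = dict(baseline)
what_if["Rules Engine"] = MEDIUM   # reduce one high-priority category to medium

print("baseline:", overall(baseline))
print("what-if :", overall(what_if))
```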
Figure 5. Trade-off Analysis Tool
Figure 6. Category Weightings
Conclusion
This article has described the i-MATE process, an approach that eases the evaluation of integration technologies within the context of implementing a service-oriented architecture.
Integration technologies are complex, highly technical, and diverse collections of products that typically operate in mission-critical business environments. Acquiring one is also a significant IT investment in ensuring smooth future integration. i-MATE's key contribution in easing the integration technology evaluation process lies in the combination of the following:
- A pre-fabricated, reusable set of generic requirements, based on the analysis of middleware component characteristics.
- A process for incorporating application-specific requirements and weighting individual requirements.
- Tool support for capturing and rapidly exploring requirement trade-offs and generating reports showing how the middleware products compare against the requirements.
The services-based integration approach holds the key to seamless integration and interoperability in the future. Done correctly, it should spare us the traditional enterprise application integration problems. With the advent of Web Services, and with the whole industry contributing to and participating in the standardization effort for the first time, Web Services and service-oriented architecture hold the promise of solving the enterprise application integration challenge. Services-based integration is an important pattern for implementing such a vision, and the careful selection of an integration technology for this purpose is crucial to the success of such a software engineering endeavor.
References
- J. Cooper and M. Fisher, Software Acquisition Capability Maturity Model (SA-CMM), Version 1.03, CMU/SEI-2002-TR-010, March 2002, at https://www.sei.cmu.edu/publications/documents/02.reports/02tr010.html
- S. Comella-Dorda, J. C. Dean, E. Morris, P. Oberndorf, A Process for COTS Software Product Evaluation, in Proceedings of 1st International Conference on COTS-Based Systems—ICCBSS 2002 Orlando, Florida, pp.86-96, February 4-6, 2002.
- J. Kontio, A Case Study in Applying a Systematic Method for COTS Selection, in Proceedings of the 18th International Conference on Software Engineering, pp 201-209, IEEE, Berlin, March 1996.
- J. Kontio, A Case Study in Applying a Systematic Method for COTS Selection, in Proceedings of the 18th International Conference on Software Engineering, pp 201-209, IEEE, Berlin, March 1996.
- P. K. Lawlis et al., A Formal Process for Evaluating COTS Software Products, Computer, Vol. 34, No. 5, May 2001.
- I. Gorton, A. Liu, Software Component Quality Assessment in Practice: Successes and Practical Impediments, in Proceedings of the International Conference on Software Engineering, Orlando, May 2002, IEEE, pp. 555-559.
- I. Gorton, A. Liu, Streamlining the Acquisition Process for Large-Scale COTS Middleware Components, in Proceedings of the 1st International Conference on COTS-Based Software Systems, Florida, Volume 2255, pp. 122-131, Lecture Notes in Computer Science, Springer-Verlag, February 2002.
About the authors
Anna Liu
Microsoft Australia
Anna Liu is an Architect in the Microsoft Australia Developer and Platform Evangelism Group. She specializes in enterprise application integration projects, and is passionate about codifying good software engineering practices and accelerating enterprise adoption of these practices and learnings. Anna was previously a research scientist holding a full-time appointment at the CSIRO, and a visiting scientist position at the Software Engineering Institute, CMU. She holds a Ph.D. in computer science.
Ian Gorton
National ICT Australia
Ian Gorton is a Senior Researcher at National ICT Australia. Until March 2004 he was Chief Architect in Information Sciences and Engineering at the US Department of Energy's Pacific Northwest National Laboratory. Previously he worked at Microsoft and IBM, as well as in other research positions, including at the CSIRO. His interests include software architectures, particularly those for large-scale, high-performance information systems that use commercial off-the-shelf (COTS) middleware technologies. He received a Ph.D. in Computer Science from Sheffield Hallam University.
This article was published in the Architecture Journal, a print and online publication produced by Microsoft. For more articles from this publication, please visit the Architecture Journal website.