Why Point-to-Point Integrations Are Evil
Investing in P2P integration solutions is a recipe for long-term systemic problems
Salesforce has been growing at 30% year over year. To grow at this rate while supporting the business, IT has had to make compromises, many of which deferred infrastructure and architectural investments to a later date. Unfortunately, over time, IT becomes the bottleneck, or worse, the blocker, preventing the company from delivering services to our customers. The diagram below depicts the current state of the P2P integrations within Salesforce. Unfortunately, at the time of this post, this is the internal infrastructure the company relies on to run a multi-billion-dollar-a-year business.
As seen in the diagram, quick P2P integrations can turn into a large headache. When your infrastructure only has a few components, P2P integration can seem like a lightweight way to connect everything together. Unfortunately, as the company grows, your integrations won't stay lightweight for long. As many organizations have learned the hard way, an infrastructure based on P2P integration quickly becomes unmanageable, brittle, and damaging to both the IT budget and the organization's ability to meet current and changing business needs.
No organization plans to have integration problems. Rather, P2P integration tends to build up over time due to sudden changes in business requirements and/or the absence of a comprehensive enterprise SOA integration strategy. Project teams are often too over-committed with existing infrastructure enhancements to make time for better planning, or they assume that more sustainable solutions are too expensive or unproven. It often takes a major event: a bloated IT budget that can no longer be met, developer mutiny, or an update to a single part of the infrastructure that causes the rest of the components to fail, to make an organization realize the true extent of its integration problems.
The Hidden Cost of P2P
It can be difficult to see P2P issues when only working with 2 or 3 architectural components. But the number of P2P connections needed to integrate a given number of components grows quadratically, far outpacing the number of systems added. This is an especially serious problem for companies that rely on connectivity with an increasing number of partners as part of their business model.
For a project involving only 3 components, you'll need only three connections, assuming bi-directional links are not needed. However, adding just two more components brings this number to 10 connections. In general, fully linking a set of components requires N(N-1)/2 physical connections, where N is the total number of components, or nodes, in the network. When finished networking ten components, we will be left with 10(10-1)/2 = 45 physical connections.
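The growth of the N(N-1)/2 formula is easy to verify with a few lines of code. The sketch below simply tabulates the connection counts for the component totals discussed in this post:

```python
def p2p_connections(n: int) -> int:
    """Number of point-to-point links needed to fully connect n components."""
    return n * (n - 1) // 2

# Connection count grows far faster than the component count itself.
for n in [3, 5, 10, 55]:
    print(f"{n:>2} components -> {p2p_connections(n):>5} connections")
```

Running this shows the jump from 3 connections at 3 components to 45 at 10 components, and 1,485 at the 55 components discussed later in this post.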
Every one of these connections represents lost development hours for net new business functionality, in addition to potential hours lost on development activities like documentation and overall maintenance on the environment. Maintenance usually comes in the form of high severity production bugs or code refactoring to accommodate business changes. We say “potential hours” here because many teams do not have the resources to keep all of their connectors up to date or refactor them to take advantage of the latest technologies, resulting in a large, undocumented tangle of code that also happens to be the most mission-critical part of the company’s infrastructure.
The P2P model does do an excellent job of keeping integration costs low at the project, or microeconomic, level. Since this is the scale at which IT historically evaluates success, it is no wonder that this is the predominant approach today. The ugly truth behind this approach is that big problems begin to hatch at the enterprise, or macroeconomic, level very quickly. Each new low-cost, tightly coupled P2P connection added to the enterprise has a non-linear, compounding effect on the total lifetime cost of the overall IT infrastructure. The P2P cost model grows quadratically as the size of the network increases linearly, as illustrated by the cost curve in the diagram below.
Looking at the cost curve and value curve together reveals a striking characteristic of the P2P network: the value of the network increases linearly over time while its costs grow quadratically. With the addition of each component, more money must be invested to eke out the same rate of return as the component before it. As a result of this phenomenon, short-term gains realized at the project level can explode into massive losses at the enterprise level. Indeed, this analysis paints an ominous portrait of traditional IT spending.
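The crossover between the two curves can be sketched numerically. The per-link cost and per-component value figures below are purely illustrative assumptions (not figures from this post); the point is the shape of the curves, not the dollar amounts:

```python
def p2p_cost(n: int, cost_per_link: int = 10_000) -> int:
    # Total integration cost: one link per component pair (quadratic growth).
    # cost_per_link is a hypothetical illustrative figure.
    return n * (n - 1) // 2 * cost_per_link

def network_value(n: int, value_per_component: int = 50_000) -> int:
    # Value added per component is roughly constant (linear growth).
    # value_per_component is likewise hypothetical.
    return n * value_per_component

# Find where cumulative cost overtakes cumulative value.
for n in range(2, 50):
    if p2p_cost(n) > network_value(n):
        print(f"Cost exceeds value once the network reaches {n} components")
        break
```

With these sample figures the crossover lands at 12 components; different cost and value assumptions move the crossover point, but a quadratic curve always overtakes a linear one eventually.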
Note, part of the content for this section was repurposed from the Bottom Line SOA: The Economics of Agility document published by Enterprise Architect Marc Rix.
Security Concerns
Depending on your corporate priorities, enterprise data security is a close second to the cost of P2P integrations. When there is no central governance for data security and/or compliance, it becomes very difficult to trace security breaches or audit who has access to sensitive data. With multiple IT teams, business teams, and application developers in a frenzy to get direct access to data silos in our Cloud and on-premise systems, we quickly create an environment in which no one has any visibility into who or what has access to which data. This also requires source and target system administrators to generate multiple API user accounts for each integration request. In general security terms, the more API user accounts in circulation, the more exposure the company has to security breaches.
Moreover, this adds to the enterprise's overall cost, since teams may not be aware of another team accessing or updating the same data, which results in data stomping. This was the case with the Workday Market Segment data attribute, in which data values critical to business processes were being overwritten by another team without anyone's knowledge, causing confusion and rework. This went on for years and resulted in a massive productivity loss and increased business risk for teams relying on accurate data.
Budget Constraints
When delivering a mobile app, web application, or new system to the enterprise, each project requires data. In order to access this data, each project typically allocates a portion of its budget to build new integrations, or enhance existing ones, to access and/or update data. This is the microeconomic-level planning described in the "Why Point-to-Point Integrations Are Evil" section. These are dollars that could otherwise be spent on new application functionality rather than data integration needs. Surely there must be a better way for teams to access accurate enterprise data?
Limited Capabilities
A result of solving for project-specific requirements with P2P integrations is that critical business logic tends to be built into the integration layer. A typical example is hard-coding specific business rules alongside the data transformations of an integration. This may seem like the right choice in the heat of a deadline; however, this approach not only creates greater maintenance costs in the future but also causes confusion on the business and application development side, since these business rules are not transparent.
This was the case with the creation of new hires into our Workday instance. Because no one had visibility to the business rules that resided within the integration, the application development team responsible for delivering new features failed to achieve their charter. A better option would enable the business logic to reside on the business application side so that the development team could quickly change and deliver new functionality.
Mergers & Acquisitions
Salesforce tends to acquire a large number of third-party companies. More and more of these companies are being brought under the Salesforce umbrella as separate business units rather than being acquired for their technology and dismantled. This means Salesforce must integrate many of the front- and back-end business functions that the acquired company uses to run its business. In some cases, this is a data migration project from the old system to the new system. However, it's when the acquired system must be absorbed into the existing ecosystem that data integrations are important. With a P2P integration architecture, it's costly, difficult, and time-intensive to quickly incorporate high-priority M&As into the ecosystem.
Lost Opportunities
Over time, P2P integrations become very difficult to support, and teams become increasingly afraid to make changes due to the unknown downstream impacts a change may have. At this stage, the IT organization has become a blocker for the company, ultimately slowing the business down from its priorities and increasing time-to-market, which equates to lost revenue and/or opportunities.
Supportability & Scalability
If you’ve gotten this far reading this post, then it should be easy to see why P2P integrations are not supportable or scalable for large organizations. To give you an example, there was a past IT project to upgrade from Oracle Financials 10g to R12; it was called Project Compass. At the time, the IT department had approximately 55 P2P integrations that required the addition of 4 new fields (Business Unit, Market Segment, Company, and Primary Coverage Country), as well as business rule changes due to the new fields. Recalling our formula N(N-1)/2, a network of 55 nodes implies 55(55-1)/2 = 1,485 connections; in the end, over a year of development time and a $1M investment were needed simply to make this one integration change.