Integrating Salesforce with other systems is inevitable as your business grows. But integrations have a way of expanding in scope. It is not unusual to end up with integration scenarios that are four to five times more complex than what you started with. This is hard to avoid because the complexity creeps in gradually as you add “one more small thing” to your integration scenarios. In this post, we offer some guidance on how to plan for the maintainability of your integration scenarios.
The key to keeping integration simple is deciding where the complexity resides. It is often tempting to push all business logic into the integration tool or layer, especially when the tool is easy to use and offers visual programming options. Instead, it is best to think of your integration scenarios as having three parts.
The Source System
The job of the source system is to send or publish the required clean data in an unambiguous format. Don’t embed formatting or data transformations in the publishing logic. This ensures that as you grow and add more target systems, the published data remains true to the source. It also makes the move from a point-to-point strategy to a hub-and-spoke strategy less painful.
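As a rough illustration of this idea, here is a minimal sketch of a publisher that wraps a record in an event envelope without reshaping it. The event name, field names, and envelope structure are all assumptions for the example, not a prescribed format:

```python
import json
from datetime import datetime, timezone

def publish_account_event(record: dict) -> str:
    """Wrap a source record in a minimal event envelope and serialize it.

    The payload passes through untouched: no target-specific renaming,
    formatting, or transformation happens at the source.
    """
    event = {
        "event_type": "account.updated",   # hypothetical event name
        "source": "salesforce",
        "published_at": datetime.now(timezone.utc).isoformat(),
        "payload": record,                 # data stays true to the source
    }
    return json.dumps(event)

# The raw Salesforce field names survive unchanged in the payload.
raw = {"Id": "001xx000003DGb2", "Name": "Acme Corp", "AnnualRevenue": 5000000}
print(publish_account_event(raw))
```

Because the payload is untouched, any number of future target systems can subscribe to the same event without forcing changes to the publisher.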
The Middleware
The middleware is responsible for listening to the publishing source system and moving data without loss. This layer should be efficient and light. It should not attempt to implement heavy-duty business logic on behalf of the target system. We have already seen this shift in big data architectures with the move from extract-transform-load (ETL) to extract-load-transform (ELT).
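To make the ELT point concrete, here is a sketch of a deliberately thin relay: it validates that a message parses and loads it into the target's staging area verbatim, deferring all transformation to the target. The function names and the in-memory staging list are illustrative assumptions:

```python
import json

def relay(event_json: str, staging: list) -> None:
    """Move an event from source to the target's staging area verbatim.

    The middleware only checks that the message parses; it does not
    reshape the payload -- transformation is deferred to the target (ELT).
    """
    event = json.loads(event_json)   # fail fast on malformed input
    staging.append(event)            # load as-is, no transform

# Usage: the event lands in staging exactly as it was published.
staging_area: list = []
relay('{"event_type": "account.updated", "payload": {"Id": "001"}}', staging_area)
```

Keeping the relay this thin means adding a new target later is a routing change, not a logic change.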
The Target System
The target system should implement the system-specific intricacies of “absorbing” the data. This is crucial. Often the logic needed by the target system (the data formatting, transformations, and business logic) is forced into the source system or the middleware. This goes unnoticed when you are doing a point-to-point integration between two systems, but you end up adding a lot of redundancy as you grow and add more systems to the mix. Carefully assess what data you really need to process live during integration and what you can defer until later.
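The live-versus-deferred split in the last sentence can be sketched as follows. The field names and the simple rename rule are hypothetical; the point is that target-specific transformation lives on the target side, and only the fields needed immediately go through the live path:

```python
def absorb(event: dict, live_fields: tuple = ("Id", "Name")) -> tuple:
    """Target-side absorption of a published event.

    Applies only the transformations the target needs right now (here, a
    toy lowercase rename), and parks the rest of the payload for later
    batch processing instead of handling everything inline.
    """
    payload = event["payload"]
    # Live path: target-specific renaming happens here, not upstream.
    live_record = {f.lower(): payload[f] for f in live_fields if f in payload}
    # Deferred path: everything else waits for asynchronous processing.
    deferred = {k: v for k, v in payload.items() if k not in live_fields}
    return live_record, deferred

event = {"payload": {"Id": "001", "Name": "Acme", "AnnualRevenue": 5000000}}
live, later = absorb(event)
```

Because each target owns its own `absorb` step, adding a second or third target never forces changes in the source system or the middleware.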