I had an interesting conversation with a Salesforce architect. He was a bit concerned about the number of logins Boomi will generate in the near future. Reasons:
- login/logout generally creates a lot of overhead
- there may be risks related to data volumes and exceptions
For your info: we currently have only batch-oriented integration processes planned for SF. We do have an event-driven architecture ready, based on REST APIs and message queuing; it has been implemented, but we don't use it yet for multiple reasons. We discussed multiple solutions and ideas (in random order). Option 1 is technically possible with the current versions of Boomi and SF.
Q: But what about options 2 and 3?
- Option 1: replace the batch processes with an event-driven architecture
  - Let SF push events to a Boomi REST API instead of having Boomi retrieve the last-modified records
  - Supported by Boomi: yes
  - Supported by SF: yes, but it requires a customization per event. We prefer to minimize the number of customizations in source systems like SF. In fact, SF has an alternative of its own called the Streaming API; see option 2.
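To make option 1 concrete, here is a minimal sketch of what the Boomi-side handling of a pushed event could look like. The payload field names (`sobjectType`, `recordId`, `changeType`) are illustrative assumptions, not an actual Boomi or Salesforce contract:

```python
import json

# Hypothetical shape of an event SF could push to a Boomi REST endpoint.
# Field names are assumptions for illustration only.
def parse_change_event(body: bytes) -> dict:
    """Extract the fields an integration process would need from a pushed event."""
    event = json.loads(body)
    return {
        "object": event["sobjectType"],      # e.g. "Account"
        "record_id": event["recordId"],
        "change_type": event["changeType"],  # e.g. "UPDATE"
    }

sample = json.dumps({
    "sobjectType": "Account",
    "recordId": "001xx000003DGb2AAG",
    "changeType": "UPDATE",
}).encode()
print(parse_change_event(sample))
```

The point of the sketch: with a push model, no login-per-poll happens on the Boomi side at all; the cost moves to the per-event customization in SF that the architect wants to avoid.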
- Option 2: implement the SF Streaming API
  - This uses long polling, which can be a better solution for integrations that poll frequently
  - Supported by Boomi: no => correct?
  - Supported by SF: yes, this is the preferred solution
  - Reference: Salesforce Developers: Streaming API
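For clarity on what "long polling" buys us here: the client issues one HTTP request that the server holds open until events arrive or a timeout elapses, then reconnects immediately, so one session carries many poll cycles. A minimal sketch of that loop, with `connect` as a stand-in for the blocking HTTP call (it is an assumption, not the real CometD client API):

```python
# Sketch of the long-polling loop the Streaming API (CometD-style) relies on.
# `connect` stands in for an HTTP call the server holds open until events
# arrive or its timeout elapses; `handle` processes each delivered event.
def poll_for_events(connect, handle, max_cycles=3):
    """Repeatedly issue a blocking connect; dispatch any events returned."""
    for _ in range(max_cycles):
        events = connect()  # blocks server-side (the "long" in long polling)
        for event in events:
            handle(event)
        # an empty list just means the timeout elapsed; reconnect right away

received = []
# Fake transport: first cycle times out, second delivers one event, third is empty.
responses = iter([[], [{"replayId": 1, "record": "001xx"}], []])
poll_for_events(lambda: next(responses), received.append)
print(received)
```

Contrast with our batch processes: frequent short polls each pay login/query overhead, while the long-poll connection stays authenticated across cycles, which is exactly the login-volume concern raised above.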
- Option 3: implement a connection pooling mechanism from Boomi to SF
  - Comparable to JDBC connection pooling
  - Q: Is it possible to reuse the existing session by decoupling the login/session management from the part that handles the payload?
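The decoupling asked about in that question can be sketched as follows: cache the session after one login, run every payload call against the cached session, and re-login only when a call reports the session is invalid. The `login` and `request` callables are stand-ins, not the actual Boomi connector or SF login API:

```python
class SessionInvalid(Exception):
    """Raised by a call when the cached session has expired (stand-in)."""

class SessionCache:
    """Login once, reuse the session id across calls, re-login on expiry."""

    def __init__(self, login):
        self._login = login        # stand-in for the SF login call
        self._session_id = None

    def call(self, request):
        # request(session_id) does the payload work and raises SessionInvalid
        # if the session is rejected, in which case we refresh once and retry.
        if self._session_id is None:
            self._session_id = self._login()
        try:
            return request(self._session_id)
        except SessionInvalid:
            self._session_id = self._login()
            return request(self._session_id)

login_count = 0
def fake_login():
    global login_count
    login_count += 1
    return f"session-{login_count}"

cache = SessionCache(fake_login)
for _ in range(3):
    cache.call(lambda sid: sid)    # three payload calls...
print(login_count)                 # ...should need only one login
```

This is the same idea as a JDBC pool, only for session tokens instead of sockets: the login/logout overhead from the first bullet list is paid once per session lifetime rather than once per process run.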