
How does Boomi architecturally address processing large data sets?

Question asked by SatyaKomatineni3761 on Oct 3, 2015
Latest reply on Apr 5, 2016 by James Ahlborn
Have any of you come across a document or post that explains the architecture in Boomi behind the scalability of processing large data sets — say, 100,000 documents at a time, where each document may be 4K in size?

I was told documents move as a batch across each step. Is this document set kept in memory as it moves along, or is it written to disk between steps? Or is it written to disk only under some circumstances? If so, what are those circumstances?

I would appreciate any documentation on this subject, whether in the reference material or in community posts.