This issue has been sitting on my plate for months, but I have not been able to find the root cause. Your help would be appreciated!
Issue: I have a process that extracts various data (Event details, Budget details, etc.) for multiple objects from a web service. The end system can accept only 200 items per request, so I split the items into batches of 200 and extract the data batch by batch. I have observed that some valid items are missing from the final file. When I extract those same records in build, they come through clean with no missing records. Since I cannot test high-volume data in build, I cannot reproduce the issue there; the records only go missing when I execute the deployed process against a full year of data. I have tried capturing errors to a file location, but none were captured. I also tried a Notify shape to log the failed records, but in vain: each record carries so many details that capturing them fails due to truncation.
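To make the setup concrete, here is roughly what the batching step does (a simplified Python stand-in, not the actual process; the item counts are made up for illustration):

```python
def make_batches(items, batch_size=200):
    """Split the item list into consecutive batches of at most batch_size.

    Each batch is then sent as one request to the web service, since the
    end system rejects requests with more than 200 items.
    """
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]

# Hypothetical example: 450 item IDs for the year -> 3 requests
items = list(range(1, 451))
batches = make_batches(items, 200)
print([len(b) for b in batches])  # -> [200, 200, 50]
```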
Observation: When I reduce the batch size to the smallest practical value (from 200 down to the current 25 items per batch), the number of missing records drops from 17 to 2, but records are still missing. Reducing the batch size further increases processing time and causes huge delays.
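One diagnostic I am considering, to get around the truncation problem: instead of logging whole records, reconcile only the IDs sent in each batch against the IDs that come back, so the log stays small and pinpoints exactly which batch drops which items. A hypothetical sketch (`response_ids` stands in for whatever key is present in the service response):

```python
def missing_ids(requested, returned):
    """Return the requested IDs that did not come back in the response.

    Logging just these IDs per batch avoids the truncation that occurs
    when capturing full records.
    """
    return sorted(set(requested) - set(returned))

# Hypothetical batch where one record is silently dropped by the service
batch = [101, 102, 103, 104]
response_ids = [101, 103, 104]
print(missing_ids(batch, response_ids))  # -> [102]
```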
I am not able to understand how changing the batch size reduces the missing count. I have even raised a case with support, but without much help. What else could cause records to be dropped when extracting data in batches? This is strange, and I have no clue how to go about it. Hoping to get some light from your thoughts.
Thanks in advance!