We have one parent process that reads a huge flat file (100,000+ records) from an FTP location and passes the data to multiple child processes for further processing using the "Data Passthrough" option.
The initial read will consume space (RAM or disk) on the server where the Atom is installed. But does passing that data to a child process create another copy of the data, or are only reference pointers passed to the child process?
My question is based on this link: High-volume troubleshooting, which says: "Be aware that the Start shape consumes memory, especially when you build large master processes that initiate many subprocess calls that run asynchronously. Each subprocess produces many documents in the Start shape and they all consume memory until each subprocess is complete."