A flat file that is 127 MB.
Every 20 to 30 lines become one XML document.
The flat file to XML map in this scenario is generating "memory dumps" (out-of-memory heap dumps).
Running conditions: a forked molecule with 4 GB per forked process.
1. Can mapping a 127 MB flat file to XML, resulting in 50,000 documents, cause low-memory conditions in a 4 GB process?
2. Is there a way to use a Data Process shape to split the 127 MB flat file into smaller chunks while keeping the integrity of the required XML profile intact?
3. Is it possible to effect such a split based purely on the flat file profile? (If so, how would it know how many lines to break on? Probably not.)
4. Has anyone got working Groovy samples against a flat file profile that break the file into smaller chunks?
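To illustrate what I'm after in question 4, here is a minimal sketch of the chunking logic itself, written in plain Java rather than Groovy so it runs standalone (Groovy on the JVM would look almost identical). This is only the splitting step under the assumption that one line equals one record; inside Boomi it would live in a Data Process custom script and write each chunk out as a separate document rather than collecting them in a list:

```java
import java.util.ArrayList;
import java.util.List;

public class FlatFileChunker {

    // Split the input lines into chunks of at most `linesPerChunk` lines.
    // Because the split happens only on line boundaries, no record is ever
    // cut in half, so each chunk remains parseable by the flat file profile.
    static List<String> chunk(List<String> lines, int linesPerChunk) {
        List<String> chunks = new ArrayList<>();
        StringBuilder current = new StringBuilder();
        int count = 0;
        for (String line : lines) {
            current.append(line).append('\n');
            if (++count == linesPerChunk) {
                chunks.add(current.toString());
                current.setLength(0);
                count = 0;
            }
        }
        if (count > 0) {
            chunks.add(current.toString()); // trailing partial chunk
        }
        return chunks;
    }

    public static void main(String[] args) {
        List<String> lines = new ArrayList<>();
        for (int i = 1; i <= 7; i++) {
            lines.add("record" + i);
        }
        // 7 records in chunks of 3 -> 3 chunks (3 + 3 + 1 records)
        List<String> chunks = chunk(lines, 3);
        System.out.println(chunks.size());
    }
}
```

In a real Boomi script I would expect to stream the source document line by line and emit each chunk as its own document (rather than buffering all chunks in memory), so the 127 MB file never has to be held whole in the heap.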