Hello! I have built an AX>Host Analytics integration for a client's transaction data. We have now reached the deployment stage, and there is a significant number of records in these data calls. My most recent load was almost 600k records and 71MB at the DB query. I wasn't aware of the potential file size, partly because of the 10MB testing limit, but also because there are significant swings between months, and the testing period I was using was much smaller. Time needed for the process to run is not a huge factor, and the allocated RAM seems adequate as well. However, on every run of more than roughly 300k records I get an error with the following text:
"Error invoking soap operation; Caused by: java.net.SocketTimeoutException: Read timed out; Caused by: Read timed out"
The bigger surprise, though, is that the data still seems to load. I get load confirmation emails from Host Analytics with a seemingly appropriate number of records loaded, and the client has done some validation and everything appears to check out, though the validation was limited in scope. My assumption is that because I am splitting/batching the data into groups of 20k lines, the first batch hits the load connector long before the last, and somewhere in that span the read times out on the client side while the load itself keeps running.
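For anyone puzzled by the same symptom: a `SocketTimeoutException: Read timed out` is raised by the caller when no response bytes arrive within its read-timeout window; it does not abort work already in flight on the remote side, which would explain a timeout error alongside a successful load. Here is a minimal, self-contained sketch of that behavior (a toy `com.sun.net.httpserver` endpoint standing in for the SOAP service, with made-up timings; not the actual integration code):

```java
import com.sun.net.httpserver.HttpServer;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.net.http.HttpTimeoutException;
import java.time.Duration;
import java.util.concurrent.atomic.AtomicBoolean;

public class ReadTimeoutDemo {
    public static void main(String[] args) throws Exception {
        AtomicBoolean serverFinished = new AtomicBoolean(false);

        // Toy "load" endpoint: takes 2 s to finish its work before replying,
        // standing in for a long-running batch load.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/load", exchange -> {
            try { Thread.sleep(2000); } catch (InterruptedException ignored) {}
            serverFinished.set(true);            // the "load" completed anyway
            byte[] body = "OK".getBytes();
            exchange.sendResponseHeaders(200, body.length);
            exchange.getResponseBody().write(body);
            exchange.close();
        });
        server.start();
        int port = server.getAddress().getPort();

        // Client with a read timeout (500 ms) shorter than the server's work.
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:" + port + "/load"))
                .timeout(Duration.ofMillis(500))
                .GET()
                .build();
        try {
            client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println("response received");
        } catch (HttpTimeoutException e) {
            System.out.println("client timed out");
        }

        Thread.sleep(2500);                      // let the handler finish
        System.out.println("server finished: " + serverFinished.get());
        server.stop(0);
    }
}
```

The client reports a timeout, yet the server-side flag still flips to `true`, mirroring how the error and the confirmation emails can both be genuine.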
My actual question: Is there anything I can do without splitting this into multiple processes, and do I even need to do anything, given that the data seems to be loading? The SQL query is already as scoped-in as we can get, pulling only the exact accounts and other fields needed for one month of data. Thank you in advance for any insight.