
Large data Database Insert

Question asked by akumar034445 on Oct 20, 2016

Hi,

 

I have a 90 MB pipe-delimited flat file with ~500,000 lines.

I am trying to read the file and insert the records into a database table.

When I use the 'Commit by Profile' commit option with a dynamic insert, the Atom crashes.

The process reads the file from SFTP and stores it to disk, deletes the file from SFTP, maps the data to the DB profile, and finally runs the DB insert operation.
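For reference, the SFTP leg of the process does the equivalent of the minimal sketch below (using the JSch library, with made-up host, credentials and paths; in the actual process this is handled by the Boomi SFTP connector, not custom code):

```java
import com.jcraft.jsch.ChannelSftp;
import com.jcraft.jsch.JSch;
import com.jcraft.jsch.Session;

public class SftpFetchAndDelete {
    public static void main(String[] args) throws Exception {
        // Hypothetical host, credentials and paths -- placeholders for illustration only
        JSch jsch = new JSch();
        Session session = jsch.getSession("dbuser", "sftp.example.com", 22);
        session.setPassword("secret");
        session.setConfig("StrictHostKeyChecking", "no"); // demo only
        session.connect();

        ChannelSftp sftp = (ChannelSftp) session.openChannel("sftp");
        sftp.connect();
        try {
            sftp.get("/outbound/feed.txt", "/data/feed.txt"); // store to local disk
            sftp.rm("/outbound/feed.txt");                    // delete from the SFTP server
        } finally {
            sftp.disconnect();
            session.disconnect();
        }
    }
}
```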

 

Question 1) Is it because of the size of the file? I am not even sure whether it crashed at the map step or at the DB insert step.

Is there any way we can insert the data with the Commit by Profile option, i.e. the whole data set in one shot?
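Conceptually, what I am hoping Commit by Profile gives me is the plain-JDBC equivalent below: build all the inserts from the file and commit them in a single transaction. (The connection string, file path, table and column names are placeholders I made up, and I realise batching ~500,000 rows in one go may itself be memory-heavy.)

```java
import java.io.BufferedReader;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class OneShotInsert {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection string, file path, table and columns -- placeholders only
        try (Connection con = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//dbhost:1521/ORCL", "dbuser", "dbpass");
             BufferedReader in = Files.newBufferedReader(Paths.get("/data/feed.txt"))) {

            con.setAutoCommit(false); // hold everything in a single transaction

            String sql = "INSERT INTO STG_FEED (COL1, COL2, COL3) VALUES (?, ?, ?)";
            try (PreparedStatement ps = con.prepareStatement(sql)) {
                String line;
                while ((line = in.readLine()) != null) {
                    String[] f = line.split("\\|", -1); // pipe-delimited record
                    ps.setString(1, f[0]);
                    ps.setString(2, f[1]);
                    ps.setString(3, f[2]);
                    ps.addBatch();
                }
                ps.executeBatch();
                con.commit(); // the whole file is committed in one shot
            } catch (Exception e) {
                con.rollback(); // nothing is persisted if any row fails
                throw e;
            }
        }
    }
}
```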

 

As a next attempt, I tried committing 1,000 rows at a time and am now using 20 threads.
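Outside of Boomi, that per-1,000-row commit would look roughly like this JDBC sketch (same made-up connection details, file path and table as above; shown single-threaded, without the 20 threads):

```java
import java.io.BufferedReader;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class IntervalCommitInsert {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection string, file path and table -- placeholders only
        try (Connection con = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//dbhost:1521/ORCL", "dbuser", "dbpass");
             BufferedReader in = Files.newBufferedReader(Paths.get("/data/feed.txt"))) {

            con.setAutoCommit(false);
            String sql = "INSERT INTO STG_FEED (COL1, COL2, COL3) VALUES (?, ?, ?)";
            try (PreparedStatement ps = con.prepareStatement(sql)) {
                String line;
                int count = 0;
                while ((line = in.readLine()) != null) {
                    String[] f = line.split("\\|", -1);
                    ps.setString(1, f[0]);
                    ps.setString(2, f[1]);
                    ps.setString(3, f[2]);
                    ps.addBatch();
                    if (++count % 1000 == 0) { // flush and commit every 1,000 rows
                        ps.executeBatch();
                        con.commit();
                    }
                }
                ps.executeBatch(); // commit whatever is left over
                con.commit();
            }
        }
    }
}
```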

 

It is taking a huge amount of time: the process has been running for 2 hours and has not completed yet. I do not see any records written to the DB, and the process is still executing. I am not even sure this will work.

Question 2) Is there any way to reduce the time?

 

Even if this works, I would like the insert operation to be atomic (if a few rows are inserted and committed and a later set fails, I would like the earlier inserts to roll back). I can add a try/catch and, in the catch block, delete the records based on the file name.
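The compensation I have in mind is a minimal sketch like the one below. It assumes a hypothetical FILE_NAME column that I would have to add to the target table and populate on every inserted row, so the catch block can delete everything loaded from the failed file:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class DeleteFailedLoad {
    // Assumes a hypothetical FILE_NAME column was populated on every inserted row
    public static void cleanUp(String fileName) throws Exception {
        try (Connection con = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//dbhost:1521/ORCL", "dbuser", "dbpass");
             PreparedStatement ps = con.prepareStatement(
                     "DELETE FROM STG_FEED WHERE FILE_NAME = ?")) {
            ps.setString(1, fileName);
            int removed = ps.executeUpdate(); // removes every row loaded from the failed file
            System.out.println("Cleaned up " + removed + " rows for " + fileName);
        }
    }
}
```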

Question 3) But what happens in the case of an Atom crash? Is there any way to roll back the inserted records?

 

Any tips on this? How can this be achieved?

 

Thanks,

Anand
