CSV to SQL optimization

Discussion created by peter.ellebye467382 on Jun 6, 2018
Latest reply on Jun 6, 2018 by peter.ellebye467382

Hi.

I've created a fairly simple process that reads a CSV file from FTP and inserts the data into MS SQL.

The process works just fine... but it executes pretty slowly.

Each file is quite large: around 15 MB and approximately 250,000 lines, and each line generates a row in the database.

I'm looking for hints on what to do, and what not to do, when working with messages of this size.

When looking at the log after the process has executed, I can see that the split and map steps execute fast enough.

The slow part is the SQL connector step, inserting all the rows.

I have tried committing by profile and by number of rows, but both settings give the same slow result.

I have only tried to use Dynamic Inserts!
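
For reference, my mental model of "commit by number of rows" is plain JDBC batching: rows are added to a batch and committed every N rows instead of once per row. Here is a minimal sketch of that idea, assuming the connector behaves roughly like raw JDBC underneath (the connection string, table name, and column names below are made up for illustration):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class BatchInsertSketch {

    // Insert CSV rows in batches, committing every batchSize rows
    // instead of once per row.
    static void insertRows(Connection conn, List<String[]> rows, int batchSize)
            throws SQLException {
        conn.setAutoCommit(false); // take control of commits
        String sql = "INSERT INTO CsvRows (ColA, ColB) VALUES (?, ?)"; // hypothetical table
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            int count = 0;
            for (String[] row : rows) {
                ps.setString(1, row[0]);
                ps.setString(2, row[1]);
                ps.addBatch();
                if (++count % batchSize == 0) {
                    ps.executeBatch(); // send the batch in one round trip
                    conn.commit();     // one commit per batch, not per row
                }
            }
            ps.executeBatch(); // flush any remaining rows
            conn.commit();
        }
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical connection details; replace with your own.
        String url = "jdbc:sqlserver://localhost:1433;databaseName=Staging;encrypt=false";
        try (Connection conn = DriverManager.getConnection(url, "user", "password")) {
            List<String[]> rows = List.of(
                    new String[] {"a", "1"},
                    new String[] {"b", "2"});
            insertRows(conn, rows, 5000);
        }
    }
}
```

If the connector ends up committing per row no matter which setting I pick, that would explain why both options feel equally slow with 250,000 rows per file.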

Hope you have some hints for me.
