Hi, I have a process which starts with an SFTP connector, retrieving files from a folder. This process will run daily, and there will be a person who puts the files in said folder manually every day.
The process has been built, but we are trying to improve it by making sure that files with the same name do not get processed more than once. This is in case there is a mistake in the manual copying of the files.
I have tried using "Get and Delete" in the SFTP operation, but we feel this still does not cover the case where a person mistakenly copies the exact same file into the folder on different days.
One idea I came across in the community was to create a database of the processed file names, but I was wondering if others have alternative ideas. I am sure others have come across a similar issue or know a best practice.
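For what it's worth, here is a minimal sketch of the database idea, written as plain Python with SQLite rather than inside any particular connector (the table name `processed` and the choice to hash the file contents are my own assumptions, not anything from an existing setup). Hashing the contents instead of only storing the name also catches the case where the same file is re-uploaded under a different name:

```python
import hashlib
import sqlite3

def file_digest(data: bytes) -> str:
    """SHA-256 of the file contents, so renamed duplicates are still caught."""
    return hashlib.sha256(data).hexdigest()

def is_new_file(conn: sqlite3.Connection, name: str, data: bytes) -> bool:
    """Return True (and record the file) if this content has not been seen before."""
    digest = file_digest(data)
    if conn.execute("SELECT 1 FROM processed WHERE digest = ?", (digest,)).fetchone():
        return False  # already processed on some earlier run
    conn.execute("INSERT INTO processed (name, digest) VALUES (?, ?)", (name, digest))
    conn.commit()
    return True

# Use a file path instead of :memory: so the table survives between daily runs.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE IF NOT EXISTS processed (name TEXT, digest TEXT UNIQUE)")
```

With this, the daily job would call `is_new_file` for each retrieved file and skip any that return False. If you only care about duplicate names (not duplicate contents), the same table works with the name as the unique column instead of the digest.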