We have an automated job that uses the Core FTP command line to log in to an SFTP server every 5 minutes and download new files that a third party continuously drops into a folder. To guard against timing issues, the host of the SFTP server sends two files for each data file: the first contains the actual data we need, while the second is simply a signal file indicating that the first is complete and ready for download. The purpose of this is to prevent us from downloading a data file before it has been fully written to the SFTP directory. These files are no larger than 500 KB. Here is an illustration:
File 1: abc001.file
File 2: abc001.sig
File 3: abc002.file
So on each login every 5 minutes, we need to loop through each .file, check for the corresponding .sig file, and only then download the .file. Afterwards, we can delete the corresponding .sig, as it is no longer needed.
In the three-file example above, we would download File 1 (abc001.file) and delete File 2 (abc001.sig), but skip File 3 (abc002.file) because there is no matching abc002.sig file yet (presumably it would appear by the next run).
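To make the pairing rule concrete, here is a minimal sketch of the selection logic in Python. The function name `plan_downloads` is hypothetical, and this only decides which files to act on from a directory listing; it does not perform any SFTP operations itself.

```python
from typing import List, Tuple

def plan_downloads(names: List[str]) -> Tuple[List[str], List[str]]:
    """Given a remote directory listing, return (data files to download,
    .sig files to delete). A .file is considered ready only when its
    matching .sig is present in the same listing."""
    present = set(names)
    to_download = []
    sigs_to_delete = []
    for name in names:
        if name.endswith(".file"):
            # abc001.file pairs with abc001.sig
            sig = name[: -len(".file")] + ".sig"
            if sig in present:
                to_download.append(name)
                sigs_to_delete.append(sig)
    return to_download, sigs_to_delete

# The three-file example from the post:
files, sigs = plan_downloads(["abc001.file", "abc001.sig", "abc002.file"])
# files == ["abc001.file"]; sigs == ["abc001.sig"]; abc002.file is skipped
```

If a scripting route is acceptable, a library such as paramiko could feed this function with `sftp.listdir()` and then carry out the plan with `sftp.get(...)` and `sftp.remove(...)`; that part is an assumption about tooling, since the Core FTP command line by itself does not appear to offer this kind of conditional logic.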
I'm not sure how to accomplish this with the command-line options alone, short of using a third-party tool.