FTP inbound error 426: files processed multiple times

  • #52204
    Robert Kersemakers
    Participant

      Hi all,

      We have an FTP-interface where we pick up files from a remote location. This remote location is behind a firewall and probably on a VPN as well. Normally we don’t have any problems, but sometimes (like this morning) we are notified that it looks like a file has been processed more than once. Normally we get this notification weeks afterwards, so it’s not easy to check, and my first reaction always is: ‘They must have sent the file more than once’.

      But this morning we got another call on this and I can see where things go wrong in the log:

      Code:

      [fset:read:ERR /0:   dsv_rdc_in:01/11/2011 11:26:35] Error while trying to retrieve 340666740.
                             Detailed error:server did not report OK, got 426
                             Curl errCode:18 Curl error: Transferred a partial file

      So my guess is: the specific file is picked up completely (not partially) and then Cloverleaf will try to delete the file. This doesn’t work (error message?), so the file remains there and is picked up again at the next read.

      The number of times a specific file has been processed (up to 5 times in SMAT) corresponds to the error messages in the log: 4 error messages means that the file has been processed 5 times. The last time there is no error anymore and the file is deleted.

      Has anyone seen this behaviour and been able to correct it? I can’t see any options in the FTP-configuration.

      Thanks,

       Robert

      Zuyderland Medisch Centrum; Heerlen/Sittard; The Netherlands

      • #73401
        Russ Ross
        Participant

          Often an FTP user ID is only partially set up, so that I’m able to log in to the foreign system via FTP and read or get the file, but don’t have permission to delete the file.

          Before I use any automation like cloverleaf or a homegrown script to do FTP/get/delete, I do a manual test and directory listing to see the ownership and permissions of the file(s) created on the foreign system.

          Getting the same file repeatedly would indicate a lack of permission to delete it, but the fact that it goes away after 4 or 5 reads is unexpected, unless the foreign system is cleaning it up.

          The next time the file is out there, do a manual interactive FTP login to the foreign system directory, check the permissions, and see if you are able to get and delete the file.
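
          For example, a quick manual session along these lines (the hostname, directory and file name are just placeholders) shows whether the account can both read and delete:

          Code:

             ftp remote.example.com
             ftp> cd /outbound
             ftp> dir
             ftp> get testfile.dat
             ftp> delete testfile.dat
             ftp> quit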

          If you are unable to perform this test then no amount of configuration on cloverleaf will be of any use.

          Once you confirm the permissions allow deletion, then looking elsewhere makes sense.

          Be sure the file you perform the test on is created exactly the same way as the one you are having problems with, which probably means letting the application or script on the foreign system create it, so the permissions are representative of what you are dealing with.

          Another thought is be sure you aren

          Russ Ross
          RussRoss318@gmail.com

        • #73402
          Richard Hart
          Participant

            Hi Robert.

            If you can get the sender to change, then Russ’ rename option is probably the most reliable.

            When we use the FTP-interface we set up an Inbound TCL script that is called after an FTP listing (Directory Parse).

            In this we only return a file if it was in the previous execution’s list, so it is less likely to be partially written.

            Where we have used the FTP package, we perform a similar action to the above and then rename the file to prove that it is not locked and only proceed if all is OK.
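
            A minimal sketch of that kind of Directory Parse proc, assuming the inbound message body is simply a whitespace-separated list of filenames (the exact format depends on how the fileset/FTP thread is configured), could look like the code below. The proc name and the global variable used to remember the previous listing are illustrative only:

            Code:

               # Hypothetical Directory Parse proc: only release files that were
               # already present in the previous poll, so a file that is still
               # being written gets at least one extra cycle to finish.
               proc dirparse_stable_files { args } {
                   global prevListing
                   keylget args MODE mode
                   set dispList {}

                   switch -exact -- $mode {
                       start {
                           set prevListing {}
                           return ""
                       }
                       run {
                           keylget args MSGID mh
                           # Assumed: the message body is a whitespace-separated
                           # list of filenames from the remote directory listing.
                           set current [msgget $mh]
                           set release {}
                           foreach fname $current {
                               if {[lsearch -exact $prevListing $fname] >= 0} {
                                   lappend release $fname
                               }
                           }
                           set prevListing $current
                           # Hand back only the files seen in two consecutive listings.
                           msgset $mh $release
                           lappend dispList "CONTINUE $mh"
                       }
                       shutdown { return "" }
                   }
                   return $dispList
               }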

          • #73403
            Robert Kersemakers
            Participant

              Thanks Russ/Richard for answering.

              We already have a system in place where all files are first placed in a temp directory and then moved to the actual directory. This should be true both for the outbound files (that we deliver) and for the inbound files (that the other party delivers). So that shouldn’t be a problem.
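
              In other words, the sender’s upload is expected to look roughly like this (the directory and file names are just examples), so a half-written file should never be visible under its final name:

              Code:

                 ftp> put results.hl7 /tmp_upload/results.hl7
                 ftp> rename /tmp_upload/results.hl7 /inbound/results.hl7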

              And this interface has been operational for about a year now and works pretty well. Just at some moments we get these ‘multiple inputs’, and we are only informed about it a day or more later. By that time the situation has returned to normal and nothing weird can be found: file permissions are OK and the files are deleted after uploading.

              So this doesn’t look like some ‘common’ problem with FTP. I think I will need to create a specific solution and I can see 2 options:

              * I already write some information (name, size, date/time) about every file we read into a .csv file. When processing a new file, check against this .csv file whether the file has already been processed; if so: KILL. Should be pretty simple, and we only have about 200 files per day to process, so the extra I/O shouldn’t hurt performance much. A rough sketch follows after this list.

              * With a Directory Parse I could try to first rename the file to a specific prefix and then only process the files with the prefix. However: I’m not sure I can rename the file (via FTP) in Directory Parse. So option 1 looks more feasible to me at the moment.
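
              A minimal sketch of what option 1 could look like as an inbound TPS proc is below. The proc name, the .csv location, and the way the source filename is looked up (via the DRIVERCTL metadata) are assumptions on my part and would need to be checked against the Cloverleaf release and thread configuration in use:

              Code:

                 # Hypothetical inbound TPS proc for option 1: KILL any message whose
                 # source file has already been logged in a .csv of processed files.
                 proc tps_dup_file_check { args } {
                     keylget args MODE mode
                     set dispList {}

                     switch -exact -- $mode {
                         start { return "" }
                         run {
                             keylget args MSGID mh
                             set csvFile "/hci/processed_files.csv"   ;# assumed location

                             # Assumed: the fileset driver exposes the source filename
                             # in the DRIVERCTL metadata as keyed-list element FILENAME.
                             set driverctl [msgmetaget $mh DRIVERCTL]
                             set fname ""
                             catch { keylget driverctl FILENAME fname }

                             # Read the names already processed (filename is the
                             # first column of each .csv record).
                             set seen {}
                             if {[file exists $csvFile]} {
                                 set fd [open $csvFile r]
                                 foreach line [split [read $fd] \n] {
                                     lappend seen [lindex [split $line ,] 0]
                                 }
                                 close $fd
                             }

                             if {$fname ne "" && [lsearch -exact $seen $fname] >= 0} {
                                 # Already processed once: drop the duplicate.
                                 lappend dispList "KILL $mh"
                             } else {
                                 # First time we see this file: log it and let it through.
                                 set fd [open $csvFile a]
                                 puts $fd "$fname,[clock format [clock seconds]]"
                                 close $fd
                                 lappend dispList "CONTINUE $mh"
                             }
                         }
                         shutdown { return "" }
                     }
                     return $dispList
                 }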

              Thanks!

              Zuyderland Medisch Centrum; Heerlen/Sittard; The Netherlands
