Accessing huge database


  • Creator
    Topic
  • #54335
    Hina Siddiqui
    Participant

My process is a database (inbound) to FTP (outbound) interface.

The database is huge (137,000+ records). When I start the interface I get an OutOfMemory error. I have attached a screenshot.

I tried to change the config in the scope; I am attaching a screenshot of my config settings too. I selected a smaller set of rows, but my process still runs only the smaller set and stops; it does not read another set.

Your feedback and help are highly appreciated.

    Viewing 4 reply threads
    • Author
      Replies
      • #81052
        Peter Heggie
        Participant

Have you tried a much smaller number of rows in Max Rows Per Read? For example, 1000 or 100? How many messages per second do you need to process?

          Peter Heggie

        • #81053
          Hina Siddiqui
          Participant

I did try even 10,000, but the issue is that it processes 10,000 records and stops. It does not proceed to the next 10,000 records.

            Could you shed some light on this issue?

          • #81054
            Bryan Dort
            Participant

The database-inbound protocol does not move the db cursor to read the next set of records. It runs the SQL statement that you have defined, then runs the same statement again at the interval that you have defined.

              You are most likely seeing the same records being retrieved from the database at every interval.

To keep records that you have already read from being read again, create a read-success action that updates those records in the database with some type of flag or date/time stamp. Then, in your original SQL, use that same flag in the WHERE clause to exclude them.
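The flag-and-filter pattern Bryan describes can be sketched outside Cloverleaf with a small Python/sqlite3 demo. The table, column names (`processed`), and batch size here are all hypothetical, standing in for your inbound SQL and read-success action:

```python
import sqlite3

# In-memory demo table; schema and column names are illustrative, not Cloverleaf's.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE messages (id INTEGER PRIMARY KEY, payload TEXT,"
    " processed INTEGER DEFAULT 0)"
)
conn.executemany(
    "INSERT INTO messages (payload) VALUES (?)",
    [(f"msg{i}",) for i in range(25)],
)

BATCH = 10  # analogous to Max Rows Per Read

def read_batch(conn):
    # Inbound SQL: select only rows not yet flagged as read.
    rows = conn.execute(
        "SELECT id, payload FROM messages WHERE processed = 0 LIMIT ?",
        (BATCH,),
    ).fetchall()
    # Read-success action: flag the rows just read so the next poll skips them.
    conn.executemany(
        "UPDATE messages SET processed = 1 WHERE id = ?",
        [(r[0],) for r in rows],
    )
    conn.commit()
    return rows

# Each loop iteration simulates one polling interval.
batches = []
while True:
    rows = read_batch(conn)
    if not rows:
        break
    batches.append(len(rows))

print(batches)  # → [10, 10, 5]: every poll drains a fresh batch until empty
```

Because every successful read marks its rows, each polling interval picks up where the last one left off instead of re-reading the same records.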

              Bryan

            • #81055
              Terry Kellum
              Participant

Provided that the database is static, many SQL dialects support a limit clause such as LIMIT 1000,1000, which means: give me 1000 rows starting at row 1000. You can increment the first element by the size of each batch you have read. You may have to work with it to eliminate fencepost errors.
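Terry's offset-paging approach can likewise be sketched in Python with sqlite3 (which spells the MySQL-style `LIMIT offset, count` as `LIMIT ? OFFSET ?`). Table and column names are illustrative; advancing the offset by the number of rows actually returned is one way to avoid the fencepost errors he mentions:

```python
import sqlite3

# Demo of offset paging against a static table; names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany(
    "INSERT INTO records (payload) VALUES (?)",
    [(f"r{i}",) for i in range(23)],
)

PAGE = 10   # rows per batch
pages = []
offset = 0
while True:
    # SQLite/PostgreSQL syntax; MySQL would accept "LIMIT 1000,1000" directly.
    rows = conn.execute(
        "SELECT id FROM records ORDER BY id LIMIT ? OFFSET ?",
        (PAGE, offset),
    ).fetchall()
    if not rows:
        break
    pages.append(len(rows))
    offset += len(rows)  # advance by rows actually returned, not by PAGE

print(pages)  # → [10, 10, 3]
```

Note the caveat from the post still applies: if rows are inserted or deleted between polls, offsets shift and records can be skipped or duplicated, which is why this only works on a static table.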

              • #81056
                Hina Siddiqui
                Participant

Thank you so much, everyone. I will try the different methods and get back with the results.

              • The forum ‘Cloverleaf’ is closed to new topics and replies.