I have a Repeat\Until loop that builds records and inserts them into a temporary table. The table's primary key is "Entry No.", and the records are inserted in increasing "Entry No." order.
Creating and inserting 250 records takes about 2.5 minutes. If I comment out the INSERT statement but let all the record create\validate code run, looping through the 250 records takes only about 1 second.
Any thoughts on where the performance bottleneck is? And how it could be improved?
The table has no event subscribers and is not tracked in the change log; the change log is not even active.
Sounds like the inserts are going across a WAN somewhere. But those numbers (2.5 × 60 / 250 = 0.6 s per insert) would give a latency of 600ms, which would get you around the world twice. Maybe a routing issue or a BDC failure somewhere?
This process was written several years ago and the customer has been using it ever since, but with much smaller transaction sets. Only with their recent increase in transaction size did the underlying performance issue become apparent. I tracked it down to code doing some unneeded processing: it ran after each record was inserted into the temporary table, rather than only once after all the records were added. As a result, that code ran 250 times instead of just once. Correcting this got the processing of the 250 records down to just 8 seconds.
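The shape of the fix is simply hoisting the expensive work out of the insert loop. A minimal AL-style sketch of the pattern (the procedure, table, and field names here are illustrative assumptions, not the actual customer code):

```al
// Sketch only: "Some Entry", MaxEntries, and PostProcess are hypothetical names.
local procedure BuildRecords(var TempEntry: Record "Some Entry" temporary)
var
    EntryNo: Integer;
begin
    repeat
        EntryNo += 1;
        TempEntry.Init();
        TempEntry."Entry No." := EntryNo;  // primary key, inserted in increasing order
        TempEntry.Insert();
        // Before the fix: PostProcess(TempEntry) was called here,
        // so the expensive processing ran once per inserted record.
    until EntryNo >= MaxEntries;

    // After the fix: run the expensive processing exactly once,
    // after all records have been inserted.
    PostProcess(TempEntry);
end;
```

Moving a per-iteration call below the loop turns O(n) invocations into one, which matches the observed drop from 2.5 minutes to about 8 seconds for 250 records.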