This is a continuation of the blog I wrote earlier today about Multiple Active Result Sets (MARS) causing locks on SQL Server.
In this blog I will show what happens on SQL Server if you keep result sets open during long transactions, and we will discuss a possible workaround that I need your feedback on.
How to reproduce this?
Together with the project team we’ve reproduced this issue on the latest versions of Business Central, even with a relatively small database.
The examples shown in this blog are from a Business Central 14 system running on a SQL Azure database scaled as General Purpose – Serverless: Gen5, 1 vCore.
The SQL statements to display the contents of the active result sets are the intellectual property of someone else, but I was given permission to share them on Yammer, where most Business Central partners have access.
The database used here is 77 GB with page compression, and the G/L Entry table has 14,641,928 (14.5 million) rows.
The code examples can be found on my GitHub.
The Loop
The code that we will execute is the following:
var
    GLAcc: Record "G/L Account";
    GLEnt: Record "G/L Entry";
begin
    GLEnt.SetRange(Amount, 100, 110);
    GLEnt.FindSet();
    repeat
        GLAcc.SetRange("No.", GLEnt."G/L Account No.");
        GLAcc.FindSet();
        Sleep(1000);
    until GLEnt.Next() = 0;
    Message('Finished');
end;
We create a loop that searches for G/L Entries, and for each entry we read the G/L Account and pause for one second.
Imagine that in real life this is a business process, for example calculating item values for each Item Ledger Entry that gets posted.
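Purely as an illustration, such a process could look something like the sketch below; the tables, filters and fields here are just an example I made up, not code from the project:

var
    ItemLedgEntry: Record "Item Ledger Entry";
    ValueEntry: Record "Value Entry";
    TotalCost: Decimal;
begin
    ItemLedgEntry.SetRange(Open, true);
    ItemLedgEntry.FindSet();
    repeat
        // The outer result set stays open on SQL Server while we read the value entries.
        ValueEntry.SetRange("Item Ledger Entry No.", ItemLedgEntry."Entry No.");
        if ValueEntry.FindSet() then
            repeat
                TotalCost += ValueEntry."Cost Amount (Actual)";
            until ValueEntry.Next() = 0;
    until ItemLedgEntry.Next() = 0;
end;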
The Result
If you look at SQL Server you will see that it keeps the result set in memory so that the service tier can fetch the next 50 records.
SQL Server will keep that result set open, and has to manage it, for as long as the transaction runs.

The “Fix”
I am very reluctant to call this a fix, but here is what we tried:
var
    GLAcc: Record "G/L Account";
    GLEnt: Record "G/L Entry";
    GLEntTemp: Record "G/L Entry" temporary;
begin
    GLEnt.SetRange(Amount, 100, 110);
    GLEnt.FindSet();
    repeat
        // Buffer the result set in a temporary record so the SQL result set can be closed.
        GLEntTemp := GLEnt;
        GLEntTemp.Insert();
    until GLEnt.Next() = 0;

    GLEntTemp.FindSet();
    repeat
        GLAcc.SetRange("No.", GLEntTemp."G/L Account No.");
        GLAcc.FindSet();
        Sleep(1000);
    until GLEntTemp.Next() = 0;
    Message('Finished');
end;
You can see that we put the first result set into a buffer table that lives in the memory of the service tier. This allows SQL Server to close the result set while the client continues the process.
The Performance WIN!
In the tests that we executed, the process was up to 2.5 or 3 times faster, simply by closing the result set and reducing the pressure on SQL Server.
Business Central SaaS
This is the difference between MARS and temporary tables on Business Central SaaS:


How about the Service Tier?
We debated whether this simply moves the problem from SQL Server to the service tier.
Based on the blog from Erik Haugaard we also tried a List object and a Dictionary object. This was, however, slower than using a temporary table…
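For reference, a List-based variant of the loop could look roughly like this; it is a sketch of the idea, not the exact code we measured:

var
    GLAcc: Record "G/L Account";
    GLEnt: Record "G/L Entry";
    AccountNos: List of [Code[20]];
    AccountNo: Code[20];
begin
    GLEnt.SetRange(Amount, 100, 110);
    GLEnt.FindSet();
    repeat
        // Collect the account numbers in memory so the SQL result set can be closed.
        AccountNos.Add(GLEnt."G/L Account No.");
    until GLEnt.Next() = 0;

    foreach AccountNo in AccountNos do begin
        GLAcc.SetRange("No.", AccountNo);
        GLAcc.FindSet();
        Sleep(1000);
    end;

    Message('Finished');
end;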
And what about the locking?
That will be discussed in the third and final part of this blog series…
Hej Marije, good to hear from you again.
Did you try the loop also with a FIND('-') instead of the FINDSET? Does that in some way reduce the time?
FIND('-') should issue a TOP X call instead of retrieving the full dataset.
https://docs.microsoft.com/en-us/dynamics365/business-central/dev-itpro/administration/optimize-sql-al-database-methods-and-performance-on-server
Just for my info: Why the GLAcc.SetRange and GLAcc.FindSet instead of GLAcc.Get?
I am trying to explain a problem, and in order to explain it I need silly code… I am aware of the silliness and of the alternative solutions. Thanks.
Some rambling:
Apart from the performance improvement of using the temporary record: won’t the set we are looping through in the first example expand if someone posts to the G/L while the loop is executing? That would probably come with a huge performance overhead and could also confuse the Repeat .. Until loop (at least if we are looping through an indexed set). Also, trying to modify the G/L Entry record in the non-temp loop would be very expensive, if possible at all, i.e. we could cause a table lock. Modifying in the temp loop would not be a problem: just get the physical record and modify it, which would only lock that record.
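Something along these lines inside the temp loop (just a sketch) should lock only that single record:

// Fetch the physical record behind the temporary copy and update it.
GLEnt.Get(GLEntTemp."Entry No.");
GLEnt.Description := 'Updated by the batch';
GLEnt.Modify();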
Since we have a Sleep in there and we do not know the number of records in the FindSet, it is difficult to state the performance improvement, but it is anything from going from 50 sec to 280 ms (50 records in the FindSet) to cutting it in half, from 1 min 40 sec to 50 sec.
If I have nested loops I use the temporary method; then SQL can use bulk reads, which is much faster. I also use the temp table method where FindSet(True, True) would be required.
I assume "GLAcc.SetRange("No.", GLEnt."G/L Account No.");" in the GLEntTemp loop is a typo .. should be GLEntTemp."G/L Account No." .. although it will probably not affect the duration much?
Hi Marije,
Unless I am missing something, the second part of the code should read:
GLEntTemp.FindSet();
repeat
    GLAcc.SetRange("No.", GLEntTemp."G/L Account No.");
    GLAcc.FindSet();
    Sleep(1000);
until GLEntTemp.Next() = 0;
This further highlights your reservations about the solution by showing how easy it would be to introduce a bug.
It is a typo, I corrected it and pushed the fix to GitHub. It had no impact on the test results.
Hi Marije,
Thanks for this useful article. It comes at just the right time, as I am facing some strange issues that I am convinced are about performance… (in BC 14 OnPrem, with SQL OnPrem)
Let me explain: we have some job queues that post warehouse shipments; those job queues fall into error with an empty error message. When I look at the call stack, it points to a code line with an "IF NOT ISEMPTY THEN" or an "IF FINDLAST THEN"…
Have you already faced the same kind of issues?
Thanks