The Job Inventory Problem | Part I

Last week I promised to blog about something less nerdy, something functional. Let’s see if I can keep that promise.

Since September last year I’ve had the honour of working on an upgrade project as a functional/strategic consultant. The customer is upgrading from Microsoft Dynamics NAV 2017 to Business Central running in the cloud.

The NAV 2017 system is customized and there is an ISV solution that does not exist in AppSource. The goal of the project is to migrate to Business Central with no customizations, only running with apps from AppSource.

From a business perspective, Jobs are the center of how processes are organized. This is fun, because my very first project with Navision Financials 26 years ago was also job oriented.

Job Inventory

When you create a Purchase Order for an Item, Business Central allows you to populate the Job No., Job Task No. and Job Planning Line No.

If you do that, Business Central will create an Item Ledger Entry with the correct Job No. populated, but it will also create a negative correction.

Business Central seems not to support Job Inventory.

Marije, that is not correct!

If you open the Item Journal page in Business Central you can make the Job No. and the Job Task No. visible.

When you do that and put some values in these fields and post a purchase you will see that you’ll get an Item Ledger Entry with the Job No. populated WITHOUT the negative correction.

This means that Business Central as such is capable of having Job-specific inventory, just not when Purchase Documents are being used.
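To make this concrete, posting such a journal line from AL could look roughly like this. This is only a sketch; the item number, job numbers and journal names are assumptions for illustration:

```al
// Sketch only: an Item Journal Line carrying job information posts an
// Item Ledger Entry with the Job No. populated and no negative correction.
procedure PostJobPurchaseFromJournal()
var
    ItemJnlLine: Record "Item Journal Line";
begin
    ItemJnlLine.Init();
    ItemJnlLine.Validate("Journal Template Name", 'ITEM');    // assumed template
    ItemJnlLine.Validate("Journal Batch Name", 'DEFAULT');    // assumed batch
    ItemJnlLine.Validate("Posting Date", WorkDate());
    ItemJnlLine.Validate("Entry Type", ItemJnlLine."Entry Type"::Purchase);
    ItemJnlLine.Validate("Item No.", '1000');                 // assumed item
    ItemJnlLine.Validate(Quantity, 5);
    ItemJnlLine.Validate("Job No.", 'JOB001');      // these two fields are what
    ItemJnlLine.Validate("Job Task No.", '1000');   // make the entry job-specific
    Codeunit.Run(Codeunit::"Item Jnl.-Post Line", ItemJnlLine);
end;
```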

Let’s wake up the developer!

If you can create Job Inventory from the Item Journal, but not from the Purchase Order, the business logic is probably in Codeunit 90, not in Codeunit 22.

When we use Statical Prism we can confirm that.

In Codeunit 90 there is a function called PostItemJnlLineJobConsumption which does the negative correction.

And what do we see! There is an event called OnPostItemJnlLineJobConsumption with a Handled flag.

What if we write this code

[EventSubscriber(ObjectType::Codeunit, Codeunit::"Purch.-Post", 'OnPostItemJnlLineJobConsumption', '', false, false)]
local procedure SkipItemJobApplicationPurchase(var IsHandled: Boolean)
begin
    IsHandled := true; // Just skip everything...
end;

That’s it!

OMG! If you activate this one line of “code” you can actually have inventory on Jobs.

Can it be that simple? What’s the catch? If it were this simple, then why does Microsoft not allow Job Inventory?

There is a catch!

It works, but there is another problem… and I will discuss that tomorrow or Wednesday in Part II of this post…

Business Central Performance | An update from MARS

Last year I wrote a few blog posts about how Business Central uses MARS to control data reads between SQL Server, the middle tier and the front end.

I wrote these posts as a promise to a customer, to raise awareness of the problem with Microsoft. I also discussed the issue on Yammer.

Surprise! Look what is in 2023 Wave 1

The MARS issues almost always result from the locking behavior we inherited from the old Navision days of table locking: once a table has been modified during a transaction, each read on it after that also takes locks.

Read the posts from last year and browse my blog for more information on that.

Rec.ReadIsolation

Microsoft is now introducing an option to control the isolation level, which will, if implemented correctly, reduce the stress on the SQL Server backend.

Stefano Demiliani has published a great blog about this yesterday.

Here is a link to Microsoft documentation

https://learn.microsoft.com/en-us/dynamics365/release-plan/2023wave1/smb/dynamics365-business-central/control-database-locking-behavior
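In AL, using the new property looks roughly like this. This is a sketch; the table and scenario are just an example of a read that doesn’t need to take or wait for locks:

```al
// Sketch: sum ledger amounts without participating in the locking
// behavior described above.
local procedure GetCustomerBalance(CustomerNo: Code[20]): Decimal
var
    CustLedgerEntry: Record "Cust. Ledger Entry";
begin
    // ReadUncommitted performs the read without taking or honoring locks,
    // even if the table was modified earlier in the transaction
    CustLedgerEntry.ReadIsolation := IsolationLevel::ReadUncommitted;
    CustLedgerEntry.SetRange("Customer No.", CustomerNo);
    CustLedgerEntry.CalcSums(Amount);
    exit(CustLedgerEntry.Amount);
end;
```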

Was this week too nerdy for you?

This week I have blogged three articles about very complex topics that are probably too difficult to understand for most of the folks working in our community.

Next week I will move to more functional posts. Sorry about the nerdiness.

The goal of these blog posts is always to educate, share and make you think.

I like to think that last year’s blog posts about MARS have now resulted in a new feature and a happy customer in Belgium.

Business Central Cloud | Performance & SetCurrentKey

Imagine the following scenario:

You are the product owner of an industry specific solution that is used by dozens of companies with the same codebase. One day you get an email from a project manager telling you about a performance issue that they have at a specific implementation.

After careful investigation using Application Insights you find out that this customer has an issue in a part of your application where all other customers run just fine.

How is that possible?

The answer might be data distribution in the database. The expensive term that SQL Server specialists use for this is “cardinality”.

If you implement Business Central for a webshop where customers only sporadically buy items, the cardinality of Customer No. can be close to that of Item No.: a filter on a customer and an item typically returns a single record.

If the same webshop now sells the same items over and over again to the same customer, that same filter may return dozens of records, and you might want to add an extra filter on Open or the current Inventory Period.

Make SQL Server use the right Index

As I wrote yesterday, you can now add indexes to your app using Index Packages. Keep the indexes in your AppSource app to only the necessary ones and add new ones based on need during the implementation.

The big question is, will SQL Server/SQL Azure actually use the index?

Here the answer is SetCurrentKey.

The Myth

For a long time there has been a myth in our ecosystem that with Business Central running on SQL Server, SetCurrentKey is no longer important.

This statement is wrong. It is very important.

The reason people think it can be removed is that SQL Server keeps statistics on your data and can determine the index based on your filters and the cardinality of your values.

OPTIMIZE FOR UNKNOWN

For Business Central, sniffing the actual parameter values has been disabled by adding the OPTIMIZE FOR UNKNOWN query hint. This means statistics are a lot less important for determining the index.

ORDER BY

With statistics out of the picture, the ORDER BY clause becomes more important, and it is converted directly from SetCurrentKey.

SQL Server will look at your ordering, compare the ORDER BY to the clustered index and then determine whether it makes sense to use a secondary index or not.
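As a small sketch of that translation (the table and filter are just an example):

```al
// Sketch: the SetCurrentKey below becomes
//   ORDER BY "Document Type", "Document No.", "Line No."
// on the generated SQL statement, which SQL Server weighs
// against the clustered index and any secondary indexes.
local procedure ProcessOpenOrderLines()
var
    SalesLine: Record "Sales Line";
begin
    SalesLine.SetCurrentKey("Document Type", "Document No.", "Line No.");
    SalesLine.SetRange("Document Type", SalesLine."Document Type"::Order);
    if SalesLine.FindSet() then
        repeat
            // ... process the line
        until SalesLine.Next() = 0;
end;
```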

SetCurrentKey ≠ Key

For a very long time it has been allowed in AL to use SetCurrentKey with ANY field in a table, even a FlowField.

Needless to say, if you do a SetCurrentKey on fields that don’t have an index, performance will decrease.

Customize SetCurrentKey

In my previous post I showed you how to customize the keys. Now let’s look at how you can implement a new key in a codebase managed on AppSource.

For this we will look at the Microsoft BaseApp, Codeunit Whse.-Purch. Release.

        PurchLine.SetCurrentKey("Document Type", "Document No.", "Location Code");
        PurchLine.SetRange("Document Type", PurchHeader."Document Type");
        PurchLine.SetRange("Document No.", PurchHeader."No.");
        PurchLine.SetRange(Type, PurchLine.Type::Item);
        PurchLine.SetRange("Drop Shipment", false);
        PurchLine.SetRange("Job No.", '');
        PurchLine.SetRange("Work Center No.", '');
        OnAfterReleaseSetFilters(PurchLine, PurchHeader);
        if PurchLine.FindSet() then begin

The Event Publisher OnAfterReleaseSetFilters allows you to change the SetCurrentKey and influence which index SQL Server might use.
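A subscriber could look roughly like this. A sketch: I am assuming the publisher’s parameter names, and the field order in the SetCurrentKey stands for a hypothetical index you added for one specific implementation:

```al
[EventSubscriber(ObjectType::Codeunit, Codeunit::"Whse.-Purch. Release", 'OnAfterReleaseSetFilters', '', false, false)]
local procedure TuneReleaseOrdering(var PurchaseLine: Record "Purchase Line"; PurchaseHeader: Record "Purchase Header")
begin
    // Hypothetical: replace the ordering so SQL Server is steered toward
    // a per-tenant index deployed for this implementation only
    PurchaseLine.SetCurrentKey("Document Type", "Location Code", "Document No.");
end;
```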

Be Careful!

Adding events like this to your app is like handing a power tool to a toddler if it is not used in the correct way.

But used in the right way, it allows you to do performance tuning in cloud scenarios.

Leave a comment

Do you like the blogs? Then please leave a comment. I am writing them as I am recovering from surgery and they distract me from the pain and being bored to death… Hopefully there will be a new blog tomorrow with an update from MARS…

Girls Just Want to Have Fun…

Business Central Cloud | Performance Tuning Rules 101

Talent can be a blessing and a curse. Or… as Johan Cruijff said: “every disadvantage has its advantage”.

“every disadvantage has its advantage”

Johan Cruijff

For as long as I can remember my talent has been making sure performance problems are fixed. I already did that in the days of Navision and its native database, and later with Dynamics NAV on SQL Server.

With Business Central running in the cloud, performance problems don’t just magically disappear. Azure SQL can easily scale up and down, allowing you to postpone performance tuning a bit, but eventually, when your system gets bigger, you’ll end up having to take care of it, just as we always did in the past.

We have more tooling and telemetry than ever. That is not what this post is about. There are many people in our community writing excellent blogs about Application Insights and there is extensive documentation and training available.

Back to performance tuning Marije?

Hell no. Never again. I like working with people and doing performance tuning means spending hours and hours behind your screen analyzing telemetry and making decisions that result in carefully crafted index plans and sometimes changes to code or even business processes.

But, every now and then questions come my way about the topic. Not so much anymore because I killed all my old domain names and email addresses so you’ll actually have to ping me via LinkedIn and most people don’t bother to do that. Problem solved.

In this case the problem did find me… the customer (a partner in this case) actually called me on my cell. Guess I have to get a new phone number too.

I told them that I am not doing that kind of work anymore. I moved to the more functional part of upgrading/implementing Business Central and I was busy doing a project.

On top of that, I was preparing for a big surgery which meant both physically and mentally getting in shape in order to go into surgery as healthy as possible with the goal of a fast recovery. In other words… my mind was off, somewhere else.

However, this specific question ended up haunting me for quite a while. My brother Rene did some functional analysis of the processes and my former company SQL Perform looked at the SQL Server side of things.

That did not completely solve all things and I ended up talking to the customer care manager of the partner again just before my surgery.

We talked and decided that I would look at the problem as a distraction while I do initial recovery… which is where we are today.

So what is the problem?

In this specific case, the problem was performance of flowfields and/or locking.

Hasn’t this been an issue for a long time in our ecosystem? I spent hours, days, weeks of my life fine-tuning SIFT using SIFT levels (yes, I am old) and MaintainSIFTIndex.

Later we started using covering indexes or indexes with included columns.

All of the above are now available directly from AL.

So when to use which?

In this case the development team was asking the question: “Marije, please tell us which one of these to use…. which one is best”.

And then the nasty answer is “it depends”.

So how do you deal with that if you have a large vertical solution and you want to implement a single codebase at dozens or hundreds of customers?

The Solution: Buy yourself as much flexibility as possible!

Since a short while ago, Microsoft has allowed you to create table extensions with only new indexes, which you can deploy as a per-tenant extension on top of your functional app.

Essentially this means that SQL Server indexes and indexed views are finally their own object type. Many of you will remember that I’ve asked Microsoft to do this many, many times.

And now it exists!

An Example

I have created an example repository on GitHub with two branches: one branch with a “vertical app” and one with an “Index Package”.

The Vertical App has a table like this:

table 60000 "Test Table"
{
    DataClassification = SystemMetadata;

    fields
    {
        field(1; "Entry No."; Integer)
        {
            AutoIncrement = true;
            DataClassification = SystemMetadata;
        }
        field(2; Description; Text[50])
        {
            DataClassification = SystemMetadata;
        }
        field(3; Amount; Decimal)
        {
            DataClassification = SystemMetadata;
        }
    }

    keys
    {
        key(PK; "Entry No.") { Clustered = true; }
    }
}

You can easily see that there is no secondary index. Only the clustered index.

What if you need an index on Description with Amount as a SumIndexField?

You can add that to your vertical solution, but maybe only one customer needs it, or maybe in a specific implementation the system will run faster if you have a covering index or included columns.

tableextension 60000 MyExtension extends "Test Table"
{
    fields
    {
        // Add changes to table fields here
    }

    keys
    {
        key(CustomerKey; Description) { SumIndexFields = Amount; }
    }
}

Instead of adding the index to your vertical solution, it might be an idea to create a table extension for it. This way you can solve specific issues for specific implementations without the risk of over-indexing your whole application in an attempt to make a one-size-fits-all solution.
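For example, if a specific implementation benefits from a covering index rather than a SIFT index, the per-tenant extension could declare it with IncludedFields instead. A sketch; the object ID and names are assumptions:

```al
tableextension 60001 MyCoveringIndex extends "Test Table"
{
    keys
    {
        // Covering index: Amount is stored in the index pages alongside
        // Description, so a query filtering on Description can read the
        // amounts from the index alone, without touching the data pages
        key(CoveringKey; Description) { IncludedFields = Amount; }
    }
}
```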

The Rule: Buy Flexibility

There is no rule when to use SumIndexFields, Covering Indexes or Included Columns. It depends often on how your vertical solution is implemented.

The only rule that you can teach your development team is to allow the implementation teams to be flexible. You don’t want your developers debating every single index that each implementation team needs in the field.

So Marije, can we still call you if we have issues?

That depends… I guess there is also no rule for that. Looking back at the last few years I did two or three performance projects. Often they are extremely complicated implementations with very large databases and hundreds or thousands of users.

The good news: the surgery is done, it went well and I am recovering. This will take up to six months and I am grateful that most of this will be in the summer period and I have scheduled to work a lot less in this period.

Life is better as a girl… just don’t tell the guys…

About Recruiters, Microsoft Azure & Dynamics 365…

I have not been blogging a whole lot lately, it’s been a bit quiet here. This is not because there is nothing to blog about but more about work-life balance.

In the last few weeks I have been communicating a bit with recruiters. I am looking for a new project somewhere in the September timeframe and I decided to change the setting for my LinkedIn profile to let recruiters know I am open to work.

To be perfectly honest, I have never actually done a project via a recruiter. Never in the almost 17 years of freelancing. It has always been word-of-mouth that led me from one project to the next.

It has always been a challenge for recruiters to figure out what “Dynamics 365” exactly is. Everyone in my network is familiar with being approached about Dynamics CRM projects while they are a NAV/AX specialist, or the other way around. I got the feeling that this had gotten better over the years.

What surprises me now is that Microsoft Azure seems to be having the same problems. It has become too ambiguous for recruiters to work with.

Microsoft Azure is a massive stack of tools that span both infrastructure and business tools with complete logging and monitoring.

Somehow the penny seems to have dropped for the infrastructure part, but it seems to be stuck there… as if Microsoft Azure is just the next version of VMWare.

My LinkedIn profile says that I am specialized in Azure… which I am, but I use it to solve business problems such as interfacing.

Dear recruiters… Microsoft Azure is so much more than infrastructure… and as part of our job automating business processes it will be more and more common to see Azure and Dynamics being combined into a skillset.

I admit that I don’t understand what it must be like to work in HR or to be a recruiter but it seems to me we need each other.

There are hundreds of projects to choose from and a balance between a challenging project and a short commute still seems to be the most difficult thing to find.

Oh, and about the blog… hopefully next week there will be a series about Jobs, Inventory and Warehouse Management in Business Central where I need the help of my community friends to figure out what is a smart way to move forward…