Why best practices for Per Tenant Extensions?

About a month or so ago I did (or tried to do) a webinar about best practices for Per Tenant Extensions. I was unhappy with the result, but I guess the story should be told, and I did promise to get back to you and finish it.

Well, I did, and I am getting ready to start sharing what I think a “perfect” per-tenant extension should look like. As always, I am looking for feedback and some interactive discussion.

Why just Per Tenant Extensions?

I believe we lost track of what we are good at as a community. I mean that in several ways but for this blog I will stick to Per Tenant Extensions.

Since we got AppSource a few years ago, our community has been partying away on it. This has resulted in a whopping 1,800 apps for Business Central today.

Don’t get me wrong, I love AppSource, and just like you I am proud of my contributions. However, I do believe 1,800 apps is a bit much for our community.

This is most easily explained with an example.


Continue reading “Why best practices for Per Tenant Extensions?”

Making a Business Central upgrade (so much) easier…

Since the beginning of the year I have had a new job/project for a partner in Denmark that you may never have heard of.

The reason for saying that is that they are (super) vertical and never “sell against” other partners. Their only competition is outside of the Business Central community.

What makes that cool is that I can essentially share anything I learn with you without feeling guilty about giving away “IP” to the competition. The better Business Central is doing as a product, the better we can compete with other industry-specific solutions. Win-win.

The Upgrade problem

One of the things I spent most of September on was finding and documenting a way to make upgrades from NAV to Business Central easier.

The good news is that we found a way to make upgrades up to 80% cheaper and to almost completely eliminate the dependency on highly skilled developers, which in our ecosystem are the resource type that is hardest to find.

The Trick?

Part of the upgrade toolkit that Microsoft provides for Business Central is a piece that was designed for migrating Great Plains (GP) customers. It is called Table Mapping.

If you customize GP, a lot happens at the SQL level, much more than in NAV, where there is more metadata.

In order to migrate custom GP tables, it lets you lift a SQL table into an extension without first having to turn it into an extension on premises.

How do I learn more?

Almost all of the content I created is un-gated and publicly available. There is a blog, a GitHub repository and a video.

Oh, and before I forget: there are several countries in which we are looking for partners to invest in our vertical solution.

Here are the links, and enjoy!



With love,


Would you configure a Porsche…

…if you knew upfront that you would never get one?

For obvious reasons I am getting a lot of questions lately about my transition and that is great. I love to talk about it and all the conversations I have help me process the transformation myself too.

Some people ask me how long I have known I was a girl, and the answer is: all my life. Even in my earliest memories I see myself in my sister’s room trying on her clothes. I remember closing my eyes and counting to 10, then opening my eyes and hoping to be in a girl’s body.

As a consequence of that, some people also assume that my transition is scheduled, or in progress. Some even thought that when I came out on LinkedIn last month it was the last step in my transformation and that the medical steps were already completed. Nothing could be further from the truth.

I came out to my wife about three months ago and to my kids about a month or so later. Everything is new and fresh.

One of the most difficult things in the male-to-female transformation process is your voice. Once you have gone through puberty, your voice becomes what we recognise as a male voice, and no hormone therapy on earth will change that. The only way to (try to) change it is via voice lessons, which take anywhere from 8 months up to 4 years, with an hour of practice each day.

This little piece of text is of no concern to the Business Central community, other than that it relates to the webinar I did for Luc last week, which I asked Luc not to publish on YouTube.

The title of my blog is: would you configure a Porsche if you knew you would never get one? This is roughly the knowledge I had when I came out three months ago. I knew of course about the concept of gender dysphoria, and that with surgery and hormones you could transform from male to female and vice versa. But I knew that in the same way that I knew Porsche had a model called the 911. I did not know any details, and why would I? I was married, I have 5 kids and a great career. Why would I spend a lot of time learning the details of a process that was not meant for me?

I have learned a lot in the last few months about myself and about the transformation process that I will go through in the next few years. It is going to require a lot of my time, and that time has to come from somewhere, as there are only 24 hours in a day and 7 days in a week. It simply means that I don’t have time to prepare webinars, as you noticed earlier this week.

There is also another component that I had not realised upfront and that I tried to explain, rather poorly, during the webinar last week.

As a person, I am finally happy with myself. I feel that as Marije I can finally be the human being, a female human being, that I always was deep inside.

The flipside of that is that before coming out, I was trying to compensate for my unhappiness. The Navision/Business Central community was one way of compensating, and looking back over the years I have spent an unrealistic amount of time trying to get recognised. Which I did.

That does not mean in any way that I was an unhappy person during that period, and the community, and especially the MVP award from Microsoft, has brought me a lot.

Unfortunately, the same award also brought unwanted side effects, like the extreme jealousy from others that I explained in the webinar and did not know how to deal with. This, together with the pressure from Microsoft to be politically correct, led to the decision a few years back to leave the MVP program on a voluntary basis. Something that is unheard of.

This in turn eventually led to my coming out a few months ago, but that is a story for another day.

About Best Practices and our community…

Some of you were upset that I hijacked an hour of your life with my emotions while promising to talk about development best practices for per-tenant extensions.

The reason the webinar was scheduled in the first place is that I have a cool new job which allows me to combine what I love with getting paid for it.

Last January I joined PrintVis, an ISV from Denmark specialised in the printing industry. My primary responsibility at PrintVis is taking care of our global partner network and making sure that our partners get what they need, from a technical perspective, to do their job effectively.

Luc and I are close friends; we live only half an hour’s drive from one another and we talk on a regular basis. I told Luc about the progress I was making on best practices, and we decided it would be cool to share it with the whole community.

My world was turned upside down only days before the webinar, and I called Luc to say I wanted to cancel.

Luc had no idea what was going on in my personal life; at that point only my wife knew. He was a little pissed about me cancelling, and I was a little pissed back, because I was cancelling for a good reason but was not ready to tell him yet.

When I called Luc in August to tell him about my transformation, we also got to talk about the webinar and why I had cancelled it. We talked a little about whether we should reschedule it, and we both agreed that the topic was important and that essentially nobody else had picked up talking about best practices after I “left” the community a few years ago.

This was another “emotion” that I decided to share with you in the webinar, and from what I could see on Twitter a lot of you agreed with it, which made me happy. My statement was that as a community we are in a state of over-engineering things.

The reason for the over-engineering is an obvious one. With Visual Studio Code, Git, PowerShell, APIs and Docker we finally have the tools to become professional. But in the last few years this has led to extreme over-engineering, and to partners having their best Business Central developers doing CI/CD instead of helping their customers.

I think we need to go back to our roots without throwing the baby out with the bathwater.

This is what I want to focus on in my new role at PrintVis and what I want to start blogging and evangelizing about.

I hope that you are patient with me and that you take the time to see what will come.

There is a lot of content in my head and great ideas that I have already discussed with my colleagues at PrintVis.

As a concrete takeaway you can use after this blog: the best Per-Tenant Extension is no Per-Tenant Extension, and the software that is easiest to maintain is software that you are not responsible for. We are not reusing each other’s ideas enough.

More is to come, and if you are having difficulties being patient, you can always talk to your manager and ask them to become a PrintVis partner. We are still looking for both implementation and service partners. Just drop me a line.

With that, this blog again comes to an end. Thank you for reading; get back to your families, because it is the weekend. I’m going to have a cup of coffee with my brother Rene, who is on his way here on his Harley.

With a lot of love,

Marije Brummel

HELP! My entire Business Central SAAS stopped running!

Today was my first day back at the (home) office programming in two and a half months. I had already spent a lot of time in the last month or so changing email addresses and other account names to my new name, but I only looked at my BC SaaS sandboxes today.

When I looked at the admin portal it looked like this:

All environments were set to Not Ready, and all options to restart, delete, etc. were greyed out.

At first I thought it was me. That I had broken something in my Office 365 subscription.

I was wrong…

It turns out that Microsoft disables all tenants that have not been used for a certain amount of time. You have to report a “production outage” to get them up and running again.

After I did that, I was good to go within 45 minutes.

Thanks Duilio for helping me.

Are you ready to move forward “WITH”-out AL?

Sometimes I just have to write my frustration away in order to clear my head. Don’t expect technical tips and tricks in this post, but maybe some inspiration.

Today I was absolutely flabbergasted. Both on Twitter and on LinkedIn (I am a social media junkie) there were actually threads about Microsoft removing the WITH statement in AL. I was literally like: OMG! Go spend your time on the future!!


I’m not going to spend more time on this idiotic topic than this. AL is a horrible programming language and in my future programming career I expect to spend less and less time each year using it.

What does your toolbox look like?

My father-in-law, may he rest in peace, could literally make anything with his hands. He was a carpenter by profession, but he could paint, do masonry, plaster, pave roads; you name it and he could do it, as long as he had the right tools, a good mindset, and the chance to watch someone do it for a while to pick up some tricks.

As programmers we seem to be married to languages and frameworks, and I can only guess why this is the case. In the old world we came from, called “On Premises”, it was hard to have multiple frameworks, operating systems and databases working side by side.


We live in a new world called cloud, preferably the Microsoft Azure cloud, and in this new world frameworks, databases and programming languages co-exist side by side just fine. C/SIDE is not your toolbox; Azure is!

How am I migrating our 200GB+ database with 2,000 custom objects to Business Central? BY USING AZURE!!!!!

– Mark Brummel –

Quote me on that.

For the last year or so I’ve been preparing “our” Business Central SaaS migration, and the first thing I did was NOT to look at AL code and extensions. The first thing I did was to implement Azure Blob Storage.

The second thing I implemented was Azure Functions, replacing C/AL code with C# code.

The third thing I implemented was JavaScript add-ins to work around limitations of the Web Client. I did this together with the fantastic team at Global Mediator, which gave birth to a product called Meta UI, which, for those of you not too stubborn to “want to do it yourselves”, makes the Web Client a fantastic place to live in.

Number four on my list was Logic Apps, replacing Job Queue processes that scan for new files, and enhancing our EDI.

Right now we are implementing Cosmos DB, with Logic Apps and a custom API, to reduce our database size and improve the scalability of our Power BI.

FIVE PROJECTS to move to Business Central SAAS WITHOUT a single line of AL code written and we started our project about 18 months ago.

The plan is to move to Business Central SAAS within the next 24 months with as few AL customisations as possible.

You know what is funny? The things we are moving OUT of Business Central are the things that make us agile; the things we always have to make ad-hoc changes to, which is why we love C/SIDE so much.

Please implement a new EDI Interface. Boom, done. With Logic Apps and an Azure Function.

Please change this KPI. Boom, done with Power BI.

Please make this change to the UI. Boom, done with Meta UI.

Oh, and of course, not to forget my friends in Denmark.

Please change the layout of this report. Boom, done with ForNAV!

My frustration is probably not gone; it won’t be gone as long as I read people on the internet still treating AL as if it were C/AL, WHICH IT IS NOT!

Fortunately, I have a fantastic new job at QBS which allows me to evangelise thinking outside the box and to help people get started with Azure. Only last week, in a few hours, I got a partner up and running with an Azure tenant running Business Central on a scalable infrastructure for performance tests.

Setting up Azure SQL Analytics (Preview) – Dynamics NAV

Telemetry is everything; you cannot have enough data when users start asking you why the system is behaving differently than yesterday, or why performance is changing over time.

This is where Azure SQL stands out from On Premises. You get so much more data, and in a way that is easy to analyse.

However, you need to know where to find it, because not everything is set up automatically after you create a database. Some is, some is not.

This blog is about how to connect Azure SQL Analytics to your Azure Monitor.

The steps for how to do this are described in this docs entry, and I don’t want to repeat existing documentation. Instead, I will add some screenshots of results for a 220 GB Microsoft Dynamics NAV database with 80 concurrent users.


Step 1 – Patience!

After you have activated Azure SQL Analytics, it will not be visible for a while. It takes time in the background to be generated and put together by the Microsoft Minions who control your tenant. Remember that these Minions have labour contracts and the right to a break every now and then.

Step 2 – Azure Monitor & More…

When the Minions are finished, the data will show up in Azure Monitor. Search for it in your environment.

And then, at least in my case, you have to click on More…

This should show a link to your Azure SQL Analytics, in my case with two databases: DEV and PROD.

Step 3 – The Dashboard

The first dashboard you’ll see is something like this, except that this one shows data 24 hours after activation, and we had a busy Friday with a performance incident. I’ll get back to that.

There are some interesting statistics already visible here, like wait stats, deadlocks and autotuning. I’ll handle wait stats in this blog, and maybe I’ll get back to deadlocks and autotuning later. There is a “good” reason the autotuning is red, and I’ll look at that tomorrow (Sunday) when nobody is working on the system.

Step 4 – Drill Down | Database Waits

If we drill down into the Database Waits, we see more details on what types of waits we are dealing with.

It does not help to look at these waits without narrowing down to the specific moments in time when “things go wrong”, because specific events relate to specific wait stats, and some waits are just there whether you like it or not. We all know CXPACKET, because NAV/Business Central fires a lot of simple queries at the Azure SQL engine, resulting in wasted CPU time. There is not much you can do about that (as far as I know).
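If you prefer raw numbers over the portal graphs, the same wait statistics are exposed in Azure SQL Database through a DMV. A quick sketch, run against the user database itself (not master):

```sql
-- Top waits accumulated for this database since the last reset
SELECT TOP (10)
       wait_type,
       wait_time_ms,
       waiting_tasks_count
FROM sys.dm_db_wait_stats
ORDER BY wait_time_ms DESC;
```

Handy as a sanity check when you want to confirm what the dashboard is telling you.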

Step 5 – Houston we have a problem!

It’s 3:51pm on Friday afternoon when my teammate sends me a message on Skype that users are complaining about performance. Since we have just turned on this great feature, I decide to use it to see what is going wrong.

We drill down again one more time and click on the graph showing the waits.

Note that this screenshot was created a day after the incident, but it clearly illustrates and confirms that “something” was off around the time my teammate sent me a message. The wait time on LCK_M_U goes through the roof! We have a blocker in our company.

Hey, this is KQL again!

Now we are in a familiar screen, because this is the same logging that Business Central Application Insights uses. Drilling down into the graph actually generated a KQL query.
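As a sketch of the kind of query the portal generates for you (assuming the AzureDiagnostics schema that Azure SQL Analytics writes to; column names may differ slightly per version):

```kusto
AzureDiagnostics
| where Category == "DatabaseWaitStatistics"
| where wait_type_s == "LCK_M_U"
| summarize TotalWaitMs = sum(delta_wait_time_ms_d) by bin(TimeGenerated, 5m)
| render timechart
```

The nice part is that once it lands in the KQL editor, you can tweak the time bins or the wait type yourself.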

Step 6 – What is causing my block?

To see what query is causing my block, I have to go back to the Azure dashboard and click on Blocks, like this:

From here we have two options. If I click on the database graph, I get taken into the KQL editor, and if I click on a specific block event, I get a more UI-like information screen. Let’s click on the latter.

Step 7 – Get the Query Hash

This is where it gets nerdy. The next screen shows the blocking victim and the blocking process.

It also shows a Query Hash.

This is where I had to use Google, but I learned that each “ad-hoc” query targeted at SQL Server gets logged internally with a query hash.

Since NAV/Business Central only uses ad-hoc queries, we have a lot of them, and it’s important to understand how to read them.

What worries me a bit here is the blocking process’ status, which is sleeping. I have to investigate this more, but I interpret this as a process that has gone silent while the user is not actively doing anything.
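While the blocking is still live, you can also cross-check it directly against the DMVs; a sketch:

```sql
-- Sessions currently blocked, and the status of whoever is blocking them
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time,
       s.status AS blocker_status  -- "sleeping" here matches what the portal showed
FROM sys.dm_exec_requests AS r
JOIN sys.dm_exec_sessions AS s
  ON s.session_id = r.blocking_session_id
WHERE r.blocking_session_id <> 0;
```

A sleeping blocker with an open transaction is a classic pattern: the lock is held, but no request is running anymore.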

Step 8 – Get the Query

Using Google (DuckDuckGo actually) I also found a way to retrieve these queries, as long as they still exist in the cache of your SQL Server. Simply use this query:

SELECT deqs.query_hash,
       deqs.query_plan_hash,
       deqp.query_plan,
       dest.text
FROM sys.dm_exec_query_stats AS deqs
CROSS APPLY sys.dm_exec_query_plan(deqs.plan_handle) AS deqp
CROSS APPLY sys.dm_exec_sql_text(deqs.sql_handle) AS dest
WHERE deqs.query_hash = 0xB569219D4B1BE79E

This will give you both the query text and the execution plan. You have to use SQL Server Management Studio to execute this against your Azure SQL database.

Step 9 – Restart the service tier

Unfortunately for me, this journey resulted in having to restart the service tier. We could not identify the exact user who had executed the locking query. Maybe we will be able to do that in a future incident, since I’m learning very fast how to use this stuff, and time is of the essence when incidents like this happen in production environments.

Needless to say, the NAV Database Locks page was not showing anything. I would have used that otherwise, of course.

Azure Application Insights 101

In my series on Application Insights for Microsoft Dynamics Business Central / NAV, this is probably the most boring post. However, it is quite important in order to teach you folks about KQL, the Application Insights API, etc.

Step 1 – Create Application Insights

In your Azure tenant, search for Application Insights and select Add.

There is not much to fill in here. The Resource Group is probably the most important field if you have a bigger Azure tenant; you want to group your stuff together.

Step 2 – Grab the key!

After the resource is created, grab the instrumentation key to your clipboard, then leave the Azure Portal and move to the Business Central admin portal.

Step 3 – Put the key in Business Central and Restart your system

Step 4 – Analyse the data

But that’s for the next blog, about KQL. This will be a language at least one person in your company needs to master. Definitely.

Wait… is that all??

Essentially yes, but there is a caveat…

The million-dollar question is probably whether or not to put multiple customers into one Application Insights resource.

This probably depends on one question: does your customer want access to the data? If they do, the data needs to be in its own Application Insights resource, so you can grant your customer access.

The good news is, and we’ll get to that, that you can query across Application Insights instances.

Tip #69 | Default Implementation for AL Interfaces

I just love it when I get an error and nothing I search for tells me what to do next.

Like this one

Value ' ' does not implement interface 'ForNAV Layout' and there is no default implementation for the mentioned interface. AL(AL0596)

There is no mention of default implementations in the Microsoft documentation.

And in fact, for this enum value I do want a default implementation, since “Empty” is a fallback: I want to use the new expandable and collapsible row feature in BC16.

The solution: this is a property at the Enum level:

DefaultImplementation = "ForNAV Layout" = "ForNAV Layout Default";
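To see the property in context, here is a minimal sketch of how the pieces fit together. The object IDs, the procedure, and the "ForNAV Layout Type" enum name are illustrative; only the DefaultImplementation line mirrors the real property:

```al
interface "ForNAV Layout"
{
    // Hypothetical member; the real interface has its own procedures.
    procedure LayoutCode(): Text;
}

codeunit 50100 "ForNAV Layout Default" implements "ForNAV Layout"
{
    // Fallback implementation, used for values like "Empty"
    // that do not declare an Implementation of their own.
    procedure LayoutCode(): Text
    begin
        exit('');
    end;
}

enum 50101 "ForNAV Layout Type" implements "ForNAV Layout"
{
    Extensible = true;
    // Without this property, the compiler raises AL0596 for
    // values that do not specify an implementation themselves.
    DefaultImplementation = "ForNAV Layout" = "ForNAV Layout Default";

    value(0; "Empty") { }
}
```

Any extension that adds values to the enum gets the default behaviour for free unless it opts in to its own implementation.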

My motivation for working with an Enum and an Interface here is that we have a partner that wants to implement a feature called “multiple layouts”, which we think does not fit the simplicity we have in mind for our core product.

This allows the partner to create a new App in AppSource with a dependency on ForNAV that introduces new features that only a subset of our customers need.

The majority of our customers are not burdened with unnecessary complexity, while the few who need it have a solution they can subscribe to.

That, my friends, is what we mean by extensibility by design.

Tip #58 | Run Extension Objects

One of the quirks of working with extensions is that you cannot run an object from the object designer. This is true for V1 and V2.

With V2 you can start an object (a page) after deployment, but this only works once and only in the Web Client.

If you just quickly want to check out a page or codeunit in the Windows client, you can write a codeunit that runs the object, even though the object does not exist in the object designer.

An example is the TowersOfHanoi app that Microsoft ships as a sample. It does not have a page extension to execute itself.
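Concretely, the trick is a throwaway codeunit in C/SIDE whose OnRun trigger runs the extension object by its ID; the ID below is illustrative, and C/SIDE happily compiles the call even though the page is not visible in the object designer:

```al
// C/AL, OnRun trigger of a throwaway codeunit:
PAGE.RUN(70000001); // ID of the page deployed by the extension
```

Run the codeunit from the object designer, and the Windows client opens the extension page.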

Works all the time.

Want to learn more about extensions? Contact me today!