Using Azure Log Analytics on older Dynamics NAV versions

Sometimes there are topics I could swear I wrote about, until someone makes me realise that this is not the case.

This week that happened with my blog about what Page 9599 means when you see it popping up in Azure Telemetry.

Some folks on Twitter started asking how it was possible that Super Users were changing data by running tables. I understand the confusion, because in newer versions this is blocked by Microsoft.

But… older versions don’t support analyzing performance telemetry using KQL right? So this girl must be seriously confused.

Although the latter happens from time to time, that is not the case here: it is possible to analyse performance telemetry for older NAV versions with Azure Log Analytics and KQL.

I created some documentation around this when I created readiness materials for QBS Group earlier this year. Since not all of you are following their blog I figured it made sense to repost it on my own blog.

The Trick – Windows Event Log

To make this work, the trick is simply to enable forwarding the Windows Event Log to Azure Log Analytics and to create a few simple KQL queries with regular expressions to analyse the data.

The result is this:

Chart1

And here is an example query:

Event
| where ParameterXml contains "AppObjectType"
| extend object = strcat(extract(@"AppObjectType:\s{1,}([^ ]+)\s", 1, ParameterXml), extract(@"AppObjectId:\s{1,}([^ ]+)\s", 1, ParameterXml))
| extend executionTime = toint(extract(@"Execution time:\s{1,}([^ ]+)\s", 1, ParameterXml))
| extend query = strcat(extract(@"SELECT\s.*FROM\s.*WHERE\s.*", 0, ParameterXml), extract(@"DELETE\s.*FROM\s.*WHERE\s.*", 0, ParameterXml), extract(@"UPDATE\s.*SET\s.*WHERE\s.*", 0, ParameterXml), extract(@"BeginTransaction\s.*", 0, ParameterXml), extract(@"Commit\s.*", 0, ParameterXml), extract(@"Rollback\s.*", 0, ParameterXml), extract(@"INSERT\s.*VALUES\s.*", 0, ParameterXml), extract(@"SELECT\s.*FROM\s.*", 0, ParameterXml), extract(@"DECLARE\s.*INSERT\s.*", 0, ParameterXml))
| where ParameterXml contains "Message: Long running SQL statement"
| order by TimeGenerated desc
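
To turn this into a chart like the one above, you can summarize the extracted values, for example the number of long running statements and the average execution time per object. This is only a sketch that reuses the extends from the query above:

Event
| where ParameterXml contains "Message: Long running SQL statement"
| extend object = strcat(extract(@"AppObjectType:\s{1,}([^ ]+)\s", 1, ParameterXml), extract(@"AppObjectId:\s{1,}([^ ]+)\s", 1, ParameterXml))
| extend executionTime = toint(extract(@"Execution time:\s{1,}([^ ]+)\s", 1, ParameterXml))
| summarize count(), avg(executionTime) by object
| order by count_ desc
| render barchart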

WARNING!!

Microsoft made small changes to these event log messages in different versions of NAV, so you may need to adjust the regular expressions from version to version.

More details can be found on my new GitHub.

Enjoy,

Marije

Business Central Page 9599 | What is it?

Time for a quick blog.

The last few weeks I’ve been heads down in some performance tuning of Business Central using modern telemetry and KQL.

This is much more powerful than the old SQL Profiler, since it allows you to see the AL stack trace where the problems are caused.

AppObjectType: Page AppObjectId: 9599

This little guy showed up in my telemetry several times: Page 9599. I could not find that page anywhere, and each time it appeared it was built on a different table.

It turns out that if you run a table, a page object is built at runtime, and that page gets ID 9599.
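
If you want to see how often this happens in your own environment, a query along these lines will find the occurrences. This is only a sketch; it assumes the standard alObjectType and alObjectId custom dimensions of Business Central telemetry:

traces
| where tostring(customDimensions.alObjectType) == "Page" and tostring(customDimensions.alObjectId) == "9599"
| project timestamp, message, customDimensions
| order by timestamp desc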

The lesson I learned is that if I see this, most often an Administrator at the customer is trying to fix data.

If this happens often, talk to the customer and see if a more permanent fix can be applied. Teach the customer that doing this can kill performance for everyone using the system.

Building a strong, modern community | Tips & Tricks

The world is rapidly changing and with that, the way we interact and consume is also different today than it was yesterday.

In the last 10 weeks I had the pleasure of working on an interesting assignment for one of my customers to help them improve the interaction with their partners with a strong focus on technical content.

At first I was not that eager to start on it. It did not match my personal ambition of going back into technical troubleshooting and learning more about Azure and Dataverse.

Then I figured, why the hell not. I’ve done it before and I have been part of this digital transformation for years.

I like it when I’m challenged to formalize what I naturally do in a format that is reusable by others. The Design Patterns project and book are an excellent example. I don’t see a reason why we cannot do that with something less technical. Something like a “design pattern” for building a community.

The first communities happened more or less by accident and had a very nerdy character. Within the Business Central world, DynamicsUsers and Mibuso.com are the best examples. The latter is possibly the best of all: the story goes that Luc van Dyck only meant to build a website to keep track of the Navision stock price, and it grew into what we know today as the BCTechDays event.

Later communities were created “by design”, once marketing departments learned about the commercial value of the concept.

The most recent communities I worked on were the How Do I videos for Microsoft, NAV Skills, the ForNAV Coffee breaks and QBS. I tried to analyze these gigs to see if I could transform them into a “recipe” for building a community.

Here is what I came up with… I’d love to hear your thoughts.

Step 1 – Pick a topic

To have interaction with a technical audience you need a good topic. Every six months or so this can be as easy as what’s new in vNext, or you can pick from the hot topics in support.
Support is a great source of inspiration: it’s where the things that go wrong get fixed. It does not have to be a programming bug; those are actually not good to use. It is better to pick a question that required someone to spend some time investigating. This gives the audience the feeling they are getting something in return for their time. Remember they are also putting in an hour or two of their week or month.
Don’t try to put too much in one webinar. It’s better to prepare one thing thoroughly. If you do want to combine, make sure the topics are related.

Step 2 – Prepare your video/demo

People love a live demo, but there is also a big risk that it can go wrong. Make sure you know what you are doing if you go live.
If your demo requires anything that takes time, either record it in advance or prepare a fallback; for example, if your demo involves installing software on a machine, have a second machine ready where the installation is already done so you can continue with the next step.
The advantage of webinars, even live ones, is that they can be edited before you put the recording online.
Write down your text if you are unsure if you can remember what you want to say. Once you are more experienced you can write down keywords.
If your demo/story requires clarification, make sure to have a supporting PowerPoint, but remember that it’s a tool, not a goal. Your demo is what is most important.
Your PowerPoint should contain keywords and bullet points, never full sentences for the audience to read. The danger is that you will simply read what’s on the slides, which takes the focus away from the story, and people may mute the sound and fast-forward through the recording of your webinar.

Step 3 – Have a Fixed Format

Even though you probably do this webinar every week or month, part of the audience may be attending for the first time. Each webinar should follow the same pattern, starting with an introduction. This allows regular attendees to focus on their work during the first few minutes. You can choose to mix a general explanation and welcome with news about your community.
Do not, ever, record the interactive part of the webinar. This ensures that the attendees are comfortable asking questions without fear of a recording being available.
If you have questions that are important to the story, make sure to record a Q & A afterwards and include it in the posted video.

Step 4 – Send out invites

Your audience is trying to run a business. They are busy and time is money. Make sure to remind them of your webinar and make sure the topic is clear. They may choose to skip it, not because they don’t like you but because the topic is something they already know about, or they may choose to watch the recording later.
Always link to the previous recordings in your newsletter.

Step 5 – Write a blog with the recording

After the webinar is completed and you’ve edited the recording you can write a short blog to go along with it.
Don’t try to repeat the content of the recording. Instead make sure that after reading your blog the audience wants to watch the recording.
At the end of the blog there should be a link to subscribe to the email that invites the reader to the next webinar.
Make sure to promote the RSS feed of the blog.

Step 6 – Promote the blog on Social Media

Share the URL of the blog on Twitter and LinkedIn. Be careful not to overdo it. Social media platforms have smart algorithms for deciding what to show. It does not help to ask everyone on the team to share the same thing, as it will simply be filtered out or even hidden because the content is not unique.
The platforms are also smart about the same people liking the same kind of content over and over.

The most important ingredient

A lot of companies are making an attempt to build a community, and if I had to guess, less than half make it and become a success. The ones that make it have strong, unique and honest content. The most common mistake is to make it too obvious that your community has a commercial character.

That does not mean your platform cannot support your business. Everyone understands in the year 2021 that a blog, mailing list or video channel has a commercial reason. Just make sure there is balance.

One last tip!

Video content is hot and it works well with a blog. This means that to be successful you need to learn video editing.

Ever since I started doing video I’ve used Camtasia. The great people at TechSmith have let me use their software for free because of my community influence. I thought that after this many years a big shout-out is well deserved. Thanks guys!

Extending the same object twice in one Extension

I’ll be honest. I was a bit disappointed after I had published my previous blog. Not about the content, but about the number of people commenting and replying on Twitter.

I talked to a few people in person and they said that it was a bit complex and maybe not everyone completely got what the problem was that I was trying to solve.

The problem is simple. You cannot extend the same table, page or report more than once in one AL project.
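
To make it concrete: something like the following in a single project, even spread over separate files, will not compile, because both objects extend the Customer table (the names, IDs and fields are just placeholders):

tableextension 50100 "Module A Customer" extends Customer
{
    fields
    {
        field(50100; "Module A Code"; Code[20]) { }
    }
}

tableextension 50101 "Module B Customer" extends Customer
{
    fields
    {
        field(50101; "Module B Code"; Code[20]) { }
    }
}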

Did you know that in the prototype of AL that was given to us as a Christmas present several years ago it actually did work? At least you could create a second extension without fields.

I complimented Microsoft about it. I was happy that we could “put things where they belong” in a project with a proper structure.

The answer was “oops”: this was not supposed to work, because you can only have one SQL companion table per table.

I bet that if they had spent a few more hours, they could have made the compiler and engine smart enough to group these together and solve the problem altogether.

Why did you say again we need this?

If you want to organize your PTE in modules you need this. I wrote about this in one of my earlier blogs.

And what is the workaround?

That is working with PreProcessorSymbols and Code Cloning.

Maybe that clarifies the blog I wrote a few days ago and gets a few more comments going.

Can we fix the Code Cloning?

Yes! I also discussed this with a few people and hopefully I will blog about that somewhere next week.

Enjoy,

Marije

PreProcessorSymbols & Per Tenant Extension Best Practices

Let’s continue where we left off last week when I shared with you two blog posts about my opinion regarding best practices for Per Tenant Extensions.

I used you as a guinea pig for the project I am currently working on at PrintVis, to get some early feedback from the community before I pitched my ideas to the development team there.

In short I can say that it went both good and bad and I’ll explain a bit why.

The biggest problem is perhaps that an average PrintVis implementation does not require that many customizations. The solution has been implemented about 400 times in 25 years and it is very mature. Most projects would not have more than the “Core” and “Report Pack” folder.

That does not mean they did not like the idea of having more complex modules in separate folders and making them compile individually.

At first I thought that the next blog post in this series would be about the folder structure of the “Core” module, but I decided to save that for a later post and move on to the most frequently asked question I got from both the PrintVis developers and the community.

How the heck do you work around not having dependencies and multiple table and page extensions in one project?

— Everyone…

The solution here came from my good friend Michael Nielsen as he pointed me in the right direction.

PreProcessorSymbols

The AL development language is based on C#, even though its core syntax is based on Pascal. – Confusing –

Everything we do in AL is actually converted into C# code. In the old days you could even debug this generated code. I cannot believe I am calling this the old days since I remember the demo at Directions NA like yesterday. I am getting old.

Since C# is essentially the base of our language, most new features we get are actually copied from C# into AL. We are moving towards a hybrid Pascal/C# language. #Fun…

A very clear example of this is the use of the Dictionary type which works almost exactly the same as in C# and allows you to run AL code a million times faster than the old temporary tables did.
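
As a small illustration of how close it is to the C# counterpart, here is a sketch of counting customers per city with a Dictionary instead of a temporary buffer table (the codeunit, procedure and variable names are just examples):

codeunit 50130 "City Statistics"
{
    procedure CountCustomersPerCity(): Integer
    var
        Customer: Record Customer;
        CityCount: Dictionary of [Text, Integer];
        CurrentCount: Integer;
    begin
        if Customer.FindSet() then
            repeat
                // Get returns false when the key is not yet in the dictionary
                if CityCount.Get(Customer.City, CurrentCount) then
                    CityCount.Set(Customer.City, CurrentCount + 1)
                else
                    CityCount.Add(Customer.City, 1);
            until Customer.Next() = 0;
        // number of distinct cities found
        exit(CityCount.Count());
    end;
}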

Another thing we got from C# are PreProcessorSymbols. They have been with us for quite a while and they are extremely powerful for clean code fanatics like me.

What does it do?

The first thing you need to do is to add the PreProcessorSymbols tag to one of the App.json files you are working with.

Personally I recommend adding it to your Per Tenant Extension and coding the exceptions against it. This way your modules don’t need it in their app.json, and you cannot forget to add or remove it when maintaining them in their own Git repositories.

As you know, I like descriptive names, so we call this one “PerTenantExtension”.
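
In the app.json manifest that looks something like this, showing only the relevant property:

{
  "preprocessorSymbols": [ "PerTenantExtension" ]
}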

The next thing you do is to add exception code to the duplicated objects. Whenever you need a table extension or a page extension in a module, add it in two places and wrap the copy in the module folder in a preprocessor directive, as shown below.
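
A minimal sketch of such a guarded copy, with placeholder object name, ID and field:

#if PerTenantExtension
tableextension 50120 "Sales Module Customer" extends Customer
{
    fields
    {
        // this copy only compiles when the PerTenantExtension symbol is defined in app.json
        field(50120; "Sales Module Code"; Code[20]) { }
    }
}
#endif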

This means that if your app.json file contains the PerTenantExtension tag the compiler will include the code; otherwise it will ignore it.

But this is code cloning!

Yes it is. And that is all that can be said about it. It is duplicate code, it is error prone and it requires discipline.

Unfortunately this is the only way right now.

BUT!!!

Not all is lost. What if we find a way to manage this somehow with a Visual Studio Code extension? What if there were an extension that “recognises” this tag and handles this for us in our “Core” extension?

After my miserably failed webinar I got a few offers from community members to investigate this and I plan to spend some time trying to get this organized.

And what about Microsoft?

Another solution could be that Microsoft pitches in and allows us to have multiple table and page extensions in one project, merging them into one C# file at compile time.

It would be wonderful if they could do that, but as there are procedures, we probably first need community buy-in: pitch it as an idea on the ideas website and then upvote it.

That may take some time, but it may be worth it.

It’s worth the discipline!

If you want my personal opinion, it’s worth the effort and discipline. If I were the owner of a Business Central shop with a few hundred customers, this is how I would manage customizations without the hassle of dependencies, maintaining AppSource apps and more.

Customers will be on different versions right?

Let’s compare this way of working to dependencies and AppSource.

Personally I think dependencies belong in AppSource. It’s way too complicated to maintain dependencies for multiple Per Tenant Extensions. It may be possible when you are doing the initial implementation and everything still lives in your head, but once the customer goes into production you’ll forget. Someone else will need to maintain it, and they’ll spend hours untangling your dependencies.

“When I wrote this, only God and I understood what I was doing. Now, God only knows.”

— Unknown

Do customers really want updates?

When customers are happy and up and running they often don’t want updates.

Let’s say that after the first implementation you take a module and add things for a second customer. Do you really think your first customer actually cares? Meanwhile, you may have introduced a bug for that initial customer.

If you clone a module into a Per Tenant Extension your customer will be on that version until you explicitly decide to upgrade them and then you can manage it.

You can have a situation where you visit the customer six months after going into production, have a cup of coffee, tell them how you enhanced the code and sell them an upgrade, with some consultancy hours too.

If your module were on AppSource, the customer would have gotten the update for free at a time they did not want it, and they might be upset and demand that you spend time fixing it for free.

Your Feedback Matters!

Best practices only work in a community! I enjoyed all of last week’s comments and used them to improve and learn. Please continue to leave comments here, on Twitter, LinkedIn, or simply send me an email.

Thanks and with love,

Marije