Extending Role Centers

Today I wanted to extend a Role Center with a Page Extension and I noticed Microsoft has updated terminology.

HomeItems = Embedding

To add a list to the Home Items you must use Embedding.

[Screenshot: RC1]

Other changes:

  • Related Information = Navigation
  • New = Creation.
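
Put together, a minimal AL sketch of such a page extension could look like the snippet below. The object ID, action names and target pages are made up; the role center being extended is the standard Order Processor Role Center.

pageextension 50100 "My Role Center Ext" extends "Order Processor Role Center"
{
    actions
    {
        // HomeItems in C/Side = Embedding in AL: adds a list to the Home Items
        addlast(Embedding)
        {
            action(MyCustomList)
            {
                ApplicationArea = All;
                Caption = 'My Custom List';
                RunObject = page "Customer List"; // replace with your own list page
            }
        }
        // New = Creation: adds an action to the New group
        addlast(Creation)
        {
            action(MyNewSalesOrder)
            {
                ApplicationArea = All;
                Caption = 'New Sales Order';
                RunObject = page "Sales Order";
                RunPageMode = Create;
            }
        }
    }
}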

[Screenshot: RC2]

The result in the Windows Client is the normal behavior: the Cues from the Activities part are grouped as usual. The Activities part in this case was also created using a Table Extension and a Page Extension.

[Screenshot: RC3]

Great Job guys!

The FOB is planning its retirement

In software we’ve invented all kinds of terminology to make it sound as if removing a feature is great. Terms like sunsetting and retirement give you a great feeling. Who does not love to see the sun go down with a beer and a loved one?

In reality it means that software that once was is no longer and we’ve had plenty of that in our beloved Navision product, now referred to as Business Central.

FOB is an abbreviation of Financials Object, just like FLF is short for Financials License File.

We use FOB files to move objects from one database to another using C/Side.

In the future of Business Central there is no room for C/Side. Microsoft is currently working very hard to make the Visual Studio Code experience and the alc.exe compiler mature enough to replace C/Side. On top of that, C/Side also manages the system tables and a lot more, which “something else” has to take over.

But this post is not about C/Side. C/Side will be there in October when NAV 2019 is released just like the Windows Client and the Fob option. But that does not mean it is smart to continue using them.

NAV Architecture, Design & Code

Loyal readers of my blog will not be surprised to read that I am not a fan of how NAV is architected, designed and coded.

I’m not talking about the base design principles and patterns. They are fine and should be used, but Microsoft should use them too.

NAV is one monolithic application where the functional modules cannot be recognised in the code structure.

This is not a problem in C/Side, because there we can easily filter on object names and object numbers, and most of us somehow leverage the Version List to filter on objects that have been modified for a project or customer.

Making a Fob

When we make a Fob we apply a filter in C/Side and export the objects. This export does not necessarily need to contain all changes; we can cherry-pick.

I did this for one of my customers this week. We run a hybrid setup on NAV 2018 and have close to 2,000 new objects added to NAV. We wanted to ship some of the modifications in DEV to PROD, but not all of them.

This is easy these days: just mark the objects, export them, put them into the Acceptance system, do one final test and go live.
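
The same cherry-picking can also be scripted from the NAV 2018 Development Shell. A sketch; the server, database, path and Version List filter values are placeholders:

# Export only the modified objects that carry a specific project tag in the Version List
# (server, database, path and filter values are placeholders)
Export-NAVApplicationObject -DatabaseServer localhost -DatabaseName "NAV2018DEV" `
    -Path "C:\Deploy\PRJ001.fob" -Filter "Modified=Yes;Version List=*PRJ001*"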

What if we don’t have Fobs anymore…

Let’s say that I move all of this customer’s modifications to an extension. I actually tried this a few times.

When NAV 2018 was just released I ran ExportToNewSyntax and Txt2Al and quickly gave up. I had hundreds, if not thousands, of compile errors, caused by many things: bugs in the export, bugs in the converter, and then the fact that NAV 2018 does not support DotNet, which breaks a big percentage of our code.
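
For reference, the conversion pipeline looks roughly like this. A sketch from the NAV 2018 Development Shell; the paths, database name and filter are placeholders:

# 1. Export the C/Side objects in the new TXT syntax
Export-NAVApplicationObject -DatabaseServer localhost -DatabaseName "NAV2018DEV" `
    -Path "C:\Convert\NewSyntax\objects.txt" -Filter "Version List=*PRJ001*" -ExportToNewSyntax

# 2. Convert the exported TXT files to .al files
& "C:\Program Files (x86)\Microsoft Dynamics NAV\110\RoleTailored Client\txt2al.exe" `
    --source="C:\Convert\NewSyntax" --target="C:\Convert\AL" --rename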

I did the same thing a few weeks ago and to my surprise I was down to 414 errors. Many things had been fixed, and DotNet variables no longer error out, although they don’t work yet.

When NAV 2019 ships in October these 414 errors will probably be down to a number that is manageable and fixable. DotNet will work and I can migrate everything to an extension.

Why is this a bad idea?

If I migrate all my changes to one super extension my life will become impossible. It will be very hard to work in Visual Studio Code because finding objects is hard. The compiler will be very slow because it has to evaluate 2,000 objects with each keystroke.

Working on the same database with more than one developer will be hard, even if we use Git and GitLens. Sure, Git will merge, but it will make mistakes and I will lose time.

Moving the parts of my changes that are finished and tested, while leaving out ongoing modifications, will be extremely hard.

The Solution

I need to break down my solution into smaller projects with dependencies. If I break down my 2,000 objects into groups that belong together and compile together, I can ship different versions when they are tested. I can write automated tests for these components and have different developers working on different parts of the application without them running into Git merge issues.

I can have junior developers working on simple extensions and have senior developers work on the core objects that are difficult to maintain.
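
Concretely, each smaller AL project declares the projects it builds on in its app.json manifest. A trimmed-down sketch with made-up GUIDs, names and versions (NAV 2018-era manifests use appId for the dependency identifier; later runtimes renamed it to id):

{
  "id": "11111111-1111-1111-1111-111111111111",
  "name": "Warehouse Component",
  "publisher": "My Company",
  "version": "1.0.0.0",
  "dependencies": [
    {
      "appId": "22222222-2222-2222-2222-222222222222",
      "name": "Core Component",
      "publisher": "My Company",
      "version": "1.0.0.0"
    }
  ]
}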

How To Get There?

I don’t know. Because DotNet is not supported yet, I have not had a chance to move stuff around easily.

In theory it should be as easy as moving .al files around into different projects and have the compiler test if everything still works.

You’ll probably run into the situation where you have to move code from OnValidate triggers to events because they belong in different components.
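
A minimal sketch of that pattern, with made-up object names and numbers: the core component raises an integration event from the trigger (published in a small events codeunit), and the dependent component subscribes to it instead of having its logic inside the trigger.

// Core component (hypothetical field and object names/IDs)
tableextension 50110 "Item Core Ext" extends Item
{
    fields
    {
        field(50110; "Special Handling Code"; Code[20])
        {
            DataClassification = CustomerContent;

            trigger OnValidate()
            var
                CoreEvents: Codeunit "Core Events";
            begin
                // the trigger only raises an event; the business logic lives elsewhere
                CoreEvents.OnAfterValidateSpecialHandlingCode(Rec);
            end;
        }
    }
}

codeunit 50111 "Core Events"
{
    [IntegrationEvent(false, false)]
    procedure OnAfterValidateSpecialHandlingCode(var Item: Record Item)
    begin
    end;
}

// Dependent component: the logic that used to sit directly in the OnValidate trigger
codeunit 50120 "Special Handling Subscriber"
{
    [EventSubscriber(ObjectType::Codeunit, Codeunit::"Core Events", 'OnAfterValidateSpecialHandlingCode', '', false, false)]
    local procedure HandleSpecialHandlingCode(var Item: Record Item)
    begin
        // component-specific validation or processing goes here
    end;
}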

Dude, is this still simplicity?

Right. This is exactly why I am debating this for my customer. For the last 20 years or so we’ve been using NAV as a development environment. The solution has very little dependency on NAV.

With Extensions and Visual Studio Code Microsoft is moving into a more Object Oriented approach with dependencies and a high level of abstraction.

For this reason I am comparing NAV to .Net Core. This allows us to use the full C# stack, Entity Framework and any HTML front-end framework like Angular, React or whatever the framework of the day will be in five years.

Extensions as Micro Services

Whatever I can do today as an Extension I do as an Extension. We have 8 extensions already and none of them have more than 20 objects.

With a little help from the ForNAV Object Explorer it is very easy to manage. Every extension gets 10 numbers assigned from our license. When we need more than 10 numbers, the extension is getting too complex and has to be broken up into more components.

How Many Extensions?

We will probably end up with about 30 extensions, maybe less if we move some of our components to .Net Core.

In .Net Core we are going to take the same approach. Small projects on a shared core. We have four projects as we speak that have organically grown over time. This summer the plan is to synchronize them and generate a NuGet package that will allow communication with the Business Central API.

Who knows, this NuGet package might be available for you as well.


Move Bespoke Symbols to Production

Today I had a short talk with my brother about generating custom symbols from C/Side and how to manage that.

Here is how I did it.

At the company I work for we have a DTAP environment and we (normally) only code in Development. Fobs and Extensions are moved from Development to Test, Acceptance and Production.

The question is how to control your symbols and move them along with each iteration.

Surely you can generate the symbols in your production database, but that might not be super smart. Alternatively, you can move your Application app file together with the rest.
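
For reference, the custom symbols themselves are generated from a C/Side database with the generatesymbolreference command of finsql.exe. A sketch, run from the RoleTailored Client folder; the database and SQL Server names are placeholders:

finsql.exe Command=generatesymbolreference, Database="NAV2018DEV", ServerName=localhost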

You need two PowerShell commands.

# Remove the currently published application symbols package from the production instance
Unpublish-NAVApp -ServerInstance NAV2018PROD -Name "Application"

# Publish the application symbols package (symbols only) taken from the shared symbols folder
Publish-NAVApp -ServerInstance NAV2018PROD -Path "\\Symbols\Application.app" -PackageType SymbolsOnly -SkipVerification

You need to make sure that the development-related flags in the Server Configuration are set to false.

[Screenshot: Dev Endpoint]
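
A sketch of setting them from the NAV Administration Shell; the key names below (DeveloperServicesEnabled and EnableSymbolLoadingAtServerStartup) are an assumption on my part, so verify them against your own instance:

# Disable development-related features on the production instance
# (key names are an assumption; verify them in your own server configuration)
Set-NAVServerConfiguration -ServerInstance NAV2018PROD -KeyName DeveloperServicesEnabled -KeyValue false
Set-NAVServerConfiguration -ServerInstance NAV2018PROD -KeyName EnableSymbolLoadingAtServerStartup -KeyValue false
Restart-NAVServerInstance -ServerInstance NAV2018PROD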

Breaking Symbols

Sometimes Symbols can break if someone in a Development database changes an object in C/Side without saving the changes.

Another developer working on an extension can be hit by that if they refresh the symbols at the same time.

The simplest and official answer is to introduce the concept of distributed development, where each developer has their own Development system and a build server creates the test environment and executes the automated tests when the build passes.

But in our ecosystem many partners seem to prefer centralised development, and then you could consider also disabling the loading of the symbols on object changes and introducing a symbol database (SDTAP).

Versioning

Another alternative might be to introduce versions of the Application symbols, but this is something I haven’t tried myself.

Not sure if that would work but I thought it was worth sharing.

Question: How do you manage your custom Application Symbols?

Performance Measuring of Large Reports

In the ForNAV standard report pack we have a few reports that are traditionally slow to run. One of my design goals when developing these reports was to see if I could improve their performance.

The names of the challenged reports will sound familiar to those who have been in our channel for a longer time.

  • Aged Accounts Receivables & Payables
  • Inventory to G/L Reconcile

The latter only exists in the North American localization, but whoever spends a lot of time on MiBuSo has seen the questions about the performance of these two.

Why are they slow?

Both reports are slow because they loop through the entry tables record by record, which means they get slower over time. Both reports were created a long time ago; in the case of the Aged Accounts Receivables & Payables report, before we even had detailed ledger entries.

Exactly how slow?

So, this is the question everybody asks and the only true answer is “it depends”. It depends not just on the size of your database but even more on the ratio between Master Data and Entries.

Also, you need a reasonable amount of data to test this, not just a CRONUS database with Microsoft Demo data.

Long live the upgrade business

When I started my freelance career 12 years ago I decided to step into upgrades. Not alone, but with the help of my good friend Tom Wickstrom. Tom has probably done thousands of upgrades over the last decades.

Tom picked two databases for me that I’ve used to test with. One database is about 60 GB and the other is about 50 GB. This is a good representation of a professional bespoke NAV system.

The ratios in these databases are different, especially at the Item level.

System A: 50 GB, 10 years of posting data
  • No. of Customers: 2741
  • No. of Cust. Ledger Entries: 71583 (ratio per customer: 26,1)
  • No. of Detailed Entries: 160287
  • No. of Items: 380
  • No. of Item Ledger Entries: 1948702 (ratio per item: 5128,2)
  • No. of Value Entries: 8198945 (ratio per item entry: 4,2)

System B: 60 GB, 11 years of posting data
  • No. of Customers: 9463
  • No. of Cust. Ledger Entries: 269694 (ratio per customer: 28,5)
  • No. of Detailed Entries: 552562
  • No. of Items: 134114
  • No. of Item Ledger Entries: 1146037 (ratio per item: 8,5)
  • No. of Value Entries: 2015607 (ratio per item entry: 1,8)

On average each customer has made between 25 and 30 purchases in 10 years. The number of sales per item is the biggest difference, as is the number of value entries per item ledger entry.

How do we Measure

The databases are installed on the same SQL Server. The servers are warmed up. We run the report once before we measure the results and then we take the average of three consecutive runs. We run using the Windows client. No Azure, no Docker, no VMware or Hyper-V. Pure iron, bare metal. Each drive is an individual 500 GB SSD.

SQL Version: 2012
NAV Version: 2017
ForNAV Version: 3.1.0.1460
Memory: 32 GB
CPU: Intel Core i7-4770, 3.40 GHz
Disks:
  • C drive: SQL Server installed here, with database and server executables
  • E drive: MDF database file
  • F drive: NDF database file
  • G drive: LDF database file

Microsoft’s Performance

  • Inventory to G/L Reconciliation: System A 12:20 (min:sec), System B 7:09
  • Aged Accounts Receivables: System A 0:17, System B 1:07


ForNAV Performance

  • Inventory to G/L Reconciliation: System A 1:25 (min:sec), System B 4:00
  • Aged Accounts Receivables: System A 0:04, System B 0:08


Conclusion

The ForNAV reports are up to 8 or 9 times faster than the Microsoft RDLC reports. The difference gets smaller as the ratio between Master Data and Entries gets lower, which makes perfect sense.

How did we do this?

Well, although it is not a secret, I am not going to tell you. We wrote this blog post to trigger you to look at our product.

There are a lot of goodies in our report pack if you are a modern programmer. Where feasible we use the MVC pattern, Dependency Inversion and Polymorphism. This means that the Aged Receivables and Payables reports use the same code where possible, which is then reused in the Statement report.

[Screenshot: ForNAV Layout]

JavaScript Objects

We use JavaScript objects to show grand totals. In ForNAV you can code in JavaScript, which includes creating objects that help you keep clean and fast front-end (report-side) code.

Prevent C/Side from using IDs used by Extensions

Last week the inevitable happened. I created a page in C/Side with an ID that had already been used by an extension.

Microsoft is aware of this issue but does not want to prevent it from happening.

The problem is that at first everything seems to work. Your new C/Side page will run just fine. I only noticed it after a restart of the Service Tier, because the restart actually does a check, but you have to dive into the Windows Event Log to find it.

The Fix

Extension objects are stored in the NAV App Object Metadata table. You can write a SQL trigger that checks whether a record exists in that table with the same ID and Type. The result is a message like the one below.

[Screenshot: Error Extension]

The Trigger can look something like this:

USE [NAV] -- change the database name here
GO

-- Drop the trigger if it already exists
IF EXISTS (SELECT * FROM sys.triggers WHERE object_id = OBJECT_ID(N'[dbo].[CheckExtensionObject]'))
  DROP TRIGGER [dbo].[CheckExtensionObject]
GO

-- Block inserts into the Object table when an extension already uses the same type and ID
CREATE TRIGGER [CheckExtensionObject] ON [dbo].[Object]
AFTER INSERT
AS
SET NOCOUNT ON

DECLARE @ins_count int
SELECT @ins_count = COUNT(*) FROM inserted

IF (@ins_count <> 0)
BEGIN
  -- Compare the inserted C/Side objects against the extension metadata
  IF ((SELECT COUNT(*) FROM inserted INNER JOIN [dbo].[NAV App Object Metadata] obj
       ON obj.[Object Type] = inserted.[Type] AND obj.[Object ID] = inserted.[ID]) <> 0)
    RAISERROR('Object Already Exists as an Extension Object', 18, -1, '');
END

SET NOCOUNT OFF
GO

With thanks to Jorg Stryk.