In this blog post there will be slightly more technical details than before, but even if you are not a developer, please do not leave yet - I will do my best to keep it short and simple. And if you really are not interested in technical details and have only 30 seconds, I still appreciate your time - please proceed directly to “Conclusions” :)
Excited that we were able to convert reports to extensions, we felt the time had come to handle something more challenging: a simple integration using files. It does not matter what the interface is supposed to do; what is important here is that the integration should read and write files on the server (without user interaction - no dialog displayed), so it can also be run by jobs.
It does not sound daunting, but if you just give it a try and use any file-related command, like this one:
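(A minimal sketch of such a call - the variable name is illustrative, but the Exists method is exactly the one quoted in the error below:)

```al
// Classic file check - FileName is just an illustrative Text variable
if File.Exists(FileName) then
    Message('File found');
```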
you will get this frustrating error:
The type or method 'Exists' cannot be used for 'Extension' development.
and instead of Exists you can put any file-related command you were planning to use…
There are exceptions though. Basically, you can now only use functions that involve user interaction (opening a file/folder dialog and the download file function). You can check which functions are allowed by looking in the (old) Development Environment at the function property named FunctionVisibility - if it is not “External”, the function cannot be used in extensions.
That was a major obstacle in our case, and what surprised us even more was stated in the FAQ:
File APIs are not available in Extensions V2. What to do? Code that relied on temporary files needs to be rewritten to rely on InStream and OutStream types. Code that relied on permanent files needs to be rewritten to use another form of permanent storage. We are considering a virtualized temporary file system to make working with temporary files available in the cloud in the future.
I suppose the reason behind this is that extensions should act as isolated entities, so it is actually there for security reasons and was designed that way. But then, if you really need file access, you have to figure out a workaround.
It seems that the only way you can exchange information with the rest of the world from extensions is web services. Moreover, in the new AL, Microsoft introduced dedicated types for this (e.g. HttpRequestMessage etc.). Therefore, we thought: “OK, it is a pity that we don’t have access to files in extensions, but let’s make a simple codeunit in the old NAV Development Environment (not an extension) and then expose it as a web service, so we can make use of it in extensions”. The first part (the web service in NAV) was easy, but then it turned out that the Http* types in AL do not support NTLM or Kerberos authentication (only basic or no authentication), so we abandoned this idea. You can of course set up NAV to use basic authentication, but then certificates are required, and it could be problematic in AL as well.
So, we had to go back to the drawing board to overcome this obstacle. We decided to develop a standalone web service in ASP.NET. It runs without authentication but allows only requests from the local NAV server (for security reasons). This worked well: from the AL code we were able to connect to that web service and call functions to read/write files to/from the specified folders, without user intervention. We also used the JSON types for the requests and responses to the web service.
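A minimal sketch of such a call from AL - note that the endpoint URL, the JSON layout and the procedure name are our own assumptions for illustration, not a standard API:

```al
procedure ReadFileViaWebService(FilePath: Text): Text
var
    Client: HttpClient;
    RequestContent: HttpContent;
    Response: HttpResponseMessage;
    Body: JsonObject;
    BodyText: Text;
    ResponseText: Text;
begin
    // Build the JSON request for our custom ASP.NET service (illustrative layout)
    Body.Add('path', FilePath);
    Body.WriteTo(BodyText);
    RequestContent.WriteFrom(BodyText);

    // The service runs locally, without authentication, next to the NAV server
    if not Client.Post('http://localhost:8080/api/files/read', RequestContent, Response) then
        Error('Cannot reach the file web service');

    Response.Content.ReadAs(ResponseText);
    exit(ResponseText); // plain-text file content
end;
```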
Of course, the weak point of this solution is that we need that web service, and moreover it has to be installed with every Dynamics NAV server (if there is more than one). But this is something we are aware of, and in the future we plan to move to Azure Functions if possible, or to make it part of the extension when Microsoft allows it.
For file interfaces in NAV, usually the best way to handle them is to use XMLports.
In our case, however, when we managed to retrieve the content of the file (via the external web service), the content was just plain text (even if it was an XML file, we received one long string). We needed somehow to pass this content as input to an XMLport. For this we naturally looked to streams (InStream/OutStream) to support us, and indeed it worked as expected:
HttpRespMsg.Content.ReadAs(ContentFileInStream);
Xmlport.Import(Xmlport::"Profile Import/Export", ContentFileInStream);
For export though, you need to do a juggling act. XMLport.Export is used with an OutStream, while in order to get the plain-text content and do something with it (in our case, send it to the web service so it can be saved as a file) we needed an InStream. It means that the OutStream must somehow be converted to an InStream. How? Don’t try it with the CopyStream function - instead, do something else: use a blob.
The approach with the blob is the following:
1. Declare a TempBlob record variable (temporary). In this record there is a field, Blob, which you can read from and write to.
2. Create an OutStream out of the blob field and write the content to it.
3. Create an InStream out of the same blob field.
4. Read from the InStream - or actually pass it wherever an InStream is required (instead of files).
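Put together, the blob trick for the export direction looks roughly like this (the XMLport name is the one from the import example above; variable and procedure names are ours):

```al
procedure ExportProfilesToText(): Text
var
    TempBlob: Record TempBlob temporary;
    OutStr: OutStream;
    InStr: InStream;
    Line: Text;
    Content: Text;
begin
    // 1+2. Write the XMLport output into the temporary blob field
    TempBlob.Blob.CreateOutStream(OutStr);
    Xmlport.Export(Xmlport::"Profile Import/Export", OutStr);

    // 3. Create an InStream out of the very same blob field
    TempBlob.Blob.CreateInStream(InStr);

    // 4. Read the content back (note: line breaks are dropped in this simple sketch)
    while not InStr.EOS do begin
        InStr.ReadText(Line);
        Content += Line;
    end;

    exit(Content); // plain text, ready to be sent to the web service
end;
```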
The other extension we were developing (or rather converting from an old modification) was about sending reports by email. In the old version, REPORT.SAVEASPDF and then SMTPMail.AddAttachment were used for this purpose, but now, as you can imagine, this does not work. However, one can quickly realize that instead of REPORT.SAVEASPDF you can use Report.SaveAs, where the last parameter is an OutStream, and instead of SMTPMail.AddAttachment you can use SMTPMail.AddAttachmentStream. Together with the blob, the code looks like this:
SMTPMail.AppendBody(BodyText);
Clear(OutStreamExt);
tempBlob.Blob.CreateOutStream(OutStreamExt);
if not SaveReportAsPdf(TaskQueueEntry, OutStreamExt) then
    Error(CannotCreatePDFErr);
tempBlob.Blob.CreateInStream(InStreamExt);
SMTPMail.AddAttachmentStream(InStreamExt, AttachmentFileName);
SMTPMail.Send;
So, to sum up, the approach with TempBlob is really useful when working with extensions (particularly when developing integrations). I would say it has now become a new design pattern.
Of course, along the way we experienced more things and faced some other challenges as well. Just to give you a few more examples:
It was really challenging, yet exciting to start this journey to NAV Extensions 2.0 world.
It is not only about new fancy tools for developers; in fact it brings real benefits. Thanks to extensions, NAV now scales even better than before, and upgrades are no longer such painful major projects.
I personally also believe that this extensions approach will have a spin-off in better quality of the code and the solution overall. This model forces the developer to rethink the solution and to make changes that do not alter the standard, but rather extend and build on top of it.
Taking into account also Microsoft Business Central (the Dynamics NAV successor) and its roadmap, it may be that extensions and this approach in general are a step on the way from a monolithic architecture to microservices. We will see…
Now, from the developer perspective:
Although AL is lacking some commands, it is still 90% the good old C/AL language. If you are a senior developer, don’t be afraid - there is no need to learn it from scratch!
There are some differences though, but with the snippets you can manage easily. The new language is still case insensitive, but by default PascalCase is used for identifiers (FINDSET is now FindSet, TESTFIELD is now TestField, etc.) and keywords are written in small letters (exit, etc.). This improves code readability.
Something that must be mentioned is that we are lacking some types (DotNet, for instance); nevertheless, we now have new types in AL instead, for instance Lists (and this is not a closed list, as Microsoft will be adding new types in the future). Many .NET usages can be replaced directly by these new AL types, resulting in much cleaner code.
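A tiny example of the new List type, which in C/AL would have required a DotNet System.Collections.Generic.List variable (the procedure name and sample values are ours):

```al
procedure ListDemo()
var
    Customers: List of [Text];
    Name: Text;
begin
    Customers.Add('Adatum');
    Customers.Add('Contoso');

    // Membership test and count without any DotNet interop
    if Customers.Contains('Contoso') then
        Message('Found, %1 entries in total', Customers.Count());

    // foreach is also new in AL
    foreach Name in Customers do
        Message(Name);
end;
```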
If something is impossible to achieve in AL code, it is possible to prepare a function in Azure and invoke it from AL.
Last but not least, about AL: it is worth noticing that there is a great community and the platform is constantly improved by Microsoft, so if you face any issues, check GitHub.
teventsub… From now on, when I am not sure what the definition of a specific element looks like, I just start typing with t. Snippets are also extremely useful at the beginning of the Extensions journey.
I know, it has been our friend for a long time already, but there are still many among us who are afraid of it - and you should not be. It gives us plenty of possibilities, and it seems some tasks can now be done only from PowerShell.
On our Extensions 2.0 journey there was also another new guest. This guy was “Git”. Something that is very natural in modern software development is a brand-new thing in the NAV world, but we liked it very much! No more extra version control tools in NAV, no more major problems with merging changes between object versions or rolling back if needed.
And taking into account that this comes together with Visual Studio Code, you just need a service where your repository is stored (e.g. GitHub). It is not only useful, but it saves you time and money.
Now that we are getting more and more experienced with Extensions 2.0, we can see more benefits of this programming concept. At the same time, it is easier for us to identify even more old customizations that are great candidates for extensions, so we can leverage them to serve our customers’ business. We also see that sometimes moving an old modification to an extension is not only about converting code, but rather about re-writing, re-designing, or maybe even starting from scratch and asking: Why? Why do we need this? Why was this done this way? Why not simplify it to serve the main purpose better? So yes, let’s start with “Why?” and move on, slowly but surely, towards an even better future - also known as the cloud and Business Central ;)