Monthly Archives: September 2020

Reporting and document generation in Power Platform

For some reason, the part of Power Platform that used to be Dynamics CRM (and was, then, transformed into what’s called “first party” applications) has always been limited in its reporting and document generation capabilities. It still seems to be the case, and it’s a strange situation for an enterprise-grade platform. After all, what’s the point of having all that structured data at your disposal when you cannot even report on it or, for that matter, issue a detailed invoice to a client?

You might not agree with what you just read, though, so I’ll try to explain.

Back in the early days, we had Advanced Find and SSRS for reporting, and we had Mail Merge for document generation.

Of course, there was that reporting wizard (which is still there), but, quite frankly, I don’t think it was ever up to the job.

The problem with all those “technologies” has always been that they were never user-friendly. There is no way a business user would ever want to write an SSRS report. Advanced Find has never been about reporting – it’s mostly a data query tool. Quite frankly, I don’t remember a lot about Mail Merge, but, since it was deprecated and replaced with document templates years ago, there is not a lot to talk about anyway.

In short, there was no user-friendly report generation tool which would be natively supported by Dynamics CRM in those early versions. And there was no user-friendly document generation tool either.

Even when using SSRS (and that would require a report developer), it was close to impossible to schedule reports, to send them by email, etc. (on-premise was a little different, but does anyone remember there was… oh, wait… there still is… an on-premise version?)

In recent years, we got the cloud version, then Power Platform; we got document templates, we still have SSRS, and we got Power BI.

But, if we take a closer look at all this variety of tools, here is what we’ll find:

  • Word Templates have tons of limitations (100 related records max, 1 level of relationships only, 1 root entity, no sorting or filtering on the relationships, etc.)
  • It is possible to use Word Templates in Power Automate, but they don’t seem to support nested repeaters, and they are not that well integrated with model-driven apps
  • On the reporting side, it is still possible to create SSRS reports. With all the same limitations we’ve always had there – no scheduling/automation, no “self-service” (you need a developer). But, most importantly, SSRS authoring tools for Dynamics 365 have been pretty much abandoned by Microsoft – the last update was released in 2017, and, even then, it would only work with Visual Studio 2015 at best (I am writing this in Q3 of 2020). If that’s not the definition of being “abandoned”, then what is?
  • There is still Advanced Find, of course. Which is not that much of a reporting tool
  • It’s possible to do certain things in Power Automate (by creating an HTML table, for example). But, again, this is not a reporting/templating tool at all


I guess this is where I should have mentioned Power BI, and this should have been the end of my post… if not for the licensing.

Power BI does not come with Power Platform or D365 licenses. Which means if we wanted to stick to Power Platform / D365 licenses only, we would have no reporting tool. Which is kind of interesting, since I’d argue that any decent enterprise implementation would need a reporting tool, and, given that SSRS does not look like a well supported option these days, Power BI seems to be the only other option.

However, while Power BI Pro might look relatively inexpensive, it’s not, necessarily, what we need. If we wanted to do the same kind of reporting we could do with SSRS (to generate PDF files, for example), we would have to get Power BI paginated reports.

Actually, there is a good hint there as to why SSRS might never be coming back (I’m not saying it won’t be coming back to Power Platform, but I definitely would not bet on that happening any time soon):


If Power BI Report Builder is sharing the same foundation as SSRS, why even bother to keep supporting both?

However, what it all means is that if we wanted to use Power BI (and, again, it seems there is nothing else that would be available “out of the box” and that would be able to cover reporting / document generation needs in model-driven apps), we would have to

  • Get Power BI Premium for the organization (since that would allow the organization to generate pretty decent paginated documents)
  • Get Power BI Pro for every user who should be able to create and share those reports



A fair question would be: at which point does all this become cost-effective, and how big should the organization be to benefit from it? It seems it would have to have at least a few hundred users.
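To put some numbers behind that estimate, here is a back-of-the-envelope calculation. The prices are round 2020 ballpark figures I’m assuming for illustration, not something stated anywhere in this post:

```javascript
// Assumed round figures (my assumption, not official pricing):
// Power BI Premium P1 ~ $5,000/month, Power BI Pro ~ $10/user/month.
var premiumPerMonth = 5000;   // assumed Premium capacity price
var proPerMonth = 10;         // assumed Pro price per report author
var reportAuthors = 10;       // users who create and share reports
var totalUsers = 300;

var monthlyCost = premiumPerMonth + proPerMonth * reportAuthors; // 5100
var costPerUser = monthlyCost / totalUsers;                      // 17
```

At 300 users, the capacity works out to about $17 per user per month, which is in the same ballpark as a typical per-user license; at 30 users, the same capacity would cost ten times as much per user. Hence the “few hundred users” threshold.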

Those implementing Power Platform on a smaller scale are still stuck with the same old question: what tool should we use for reporting / document generation in model-driven apps?

Actually, I don’t have a good answer to that – there are some awesome third-party tools such as Xpertdoc, for example. However, the only reason they exist is that Power Platform itself is not offering those capabilities out of the box.

PS. In the next post, I’m going to demonstrate how to create an Azure function that may help with document generation (and that can be utilized from Power Automate), but, even so, that’s not going to answer the question above. This is one of the options I looked at on the project, and it might still be my backup plan, but I’d very much prefer for those reporting / document generation needs to be covered out of the box.

PPS (Oct 2): that Azure Function post may have to wait a little – I had to dig a little more into the paginated reports

From “just make it work” to Low-Code to Pro-Dev

A few years ago, there was a common mantra on pretty much any project I was on:

“Stick to the configuration. There should be no javascripts and/or plugins”

This was happening because quite a few people had run into problems with those low-level customizations in the past. Which is understandable – to start with, you need somebody who can support those customizations moving forward, and, quite often, even the plugin source code would have been lost.

That’s about when Microsoft came up with the concept of “low code” – those are your Canvas Apps and Microsoft Flow (which is Power Automate now). At first, the idea seemed quite ridiculous, but, by constantly pushing the boundaries of low code, Canvas Apps and Power Automate have turned into very powerful tools.

Which did not come without some sacrifices, since, if you think “low code” means “low effort”, that is not always the case anymore. Learning the syntax and figuring out the various tricks and limitations of those tools takes time. Besides, “low code” is not the same as “no code” – just think about all that JSON parsing in Power Automate, organizing actions into the correct sequence, writing up Canvas App formulas, etc. And it presents other problems – what somebody can do easily with a few lines of code may actually require a few separate actions in Power Automate or a tricky formula in a Canvas App. Does it save time? Not necessarily. Does it open up “development” to those folks who would not know how to create a JavaScript/.NET app? For sure.

In the meantime, plugins and custom workflow activities were still lingering there. Those of us not afraid of these monsters kept using them to our advantage, since, for instance, there are situations when you need synchronous server-side logic. Not to mention that it may be faster and easier to write a for loop in .NET than to do the same in Power Automate. But, it seemed, those technologies were not quite encouraged anymore.
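To illustrate that last point, here is a trivial example (in JavaScript, to keep it simple) of the kind of loop that takes a few lines of code but three separate Power Automate actions – an “Initialize variable”, an “Apply to each”, and an “Increment variable” inside it:

```javascript
// Summing values over a collection: a few lines of code vs three
// separate actions (and a lot of clicking) in Power Automate.
var amounts = [10, 20, 30];
var total = 0;
for (var i = 0; i < amounts.length; i++) {
    total += amounts[i];
}
// total is now 60
```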

On the client side, we got Business Rules. Which were supposed to become a replacement for various javascript web resources… except that, of course, it did not quite work out. The Business Rules designer went through a few iterations and, eventually, got stuck at the point where it’s only usable for simple scenarios. For example, if I have 20 fields to lock on the form, I’ll go with javascript vs the business rules designer, since it would be faster to do and easier to support. For something simpler, though, I might create a business rule.

But then we got PCF components, and, so, the whole “low code” approach was somewhat ditched.

How come?

Well, think of it. There are lots of components in the PCF gallery, but none of the clients I know would agree to rely on the open-source code unless that code is, somehow, supported. And, since a lot of those components are released and supported by PCF enthusiasts (rather than by Microsoft partners, for example), there is absolutely no guarantee that support will last.

At least I know I can’t even support my PCF components beyond providing occasional updates. Basically, if there is a bug… and if you discover it once the component is in production… you are on your own.

Which means anyone considering using PCF components in their environments should assume that a pro-dev person will be required to support such solutions.

PCF is only one example, though. There has been a lot of emphasis on proper ALM and integration with DevOps in recent years, and those are topics which are pro-dev by definition.

What else… Custom Connectors? Data providers for Virtual Entities? Azure Functions to support various integrations and/or as extensions for the Apps/Flows? Web resources are still alive since there is no replacement (PCF-s were never meant to replace the web resources), and plugins are still there.

The whole concept of Dynamics CRM/D365/PowerApps development has come a full circle, it seems. From the early days when everything was allowed, all the way through the days when scared clients would insist on not having anything to do with the plugins/javascripts, and back to the point where we actually do need developers to support our solutions.

So, for now, see ya “no code”. I guess we’ll be there again, but, for the time being, we seem to be very much on the opposite side.

Connection references

Connection references have been released (well, not quite – they are in public preview, which is close enough), and, from the ALM perspective, this might be one of the most anticipated features for those of us who have been struggling with all those connections in the Flows (if your Flows are doing anything other than connecting to the current CDS environment, chances are you have been struggling).

The announcement came out earlier today:

And, right away, when looking at the connections in my newly created Flow, I see connection references instead of connections:


Which is, honestly, a very pro-dev way of naming things, but, I guess, they had to be called something different from the former connections… and there we go, we have connection references now. Still, that name captures the nature of this new thing quite accurately.

It’s interesting my older Flows are still using “Connections”, not “Connection References”:


And it does not matter whether I am adding new actions or updating existing ones – older Flows just keep using connections.

This can be solved by re-importing the Flow (unmanaged in my case), though:


Not sure if there is an easier way to reset the Flow so it starts using connection references, but I just added it to a new solution, exported the solution, deleted both the Flow and my new solution, then imported it back.

By the way, I made a rookie mistake while trying it all out. When I tried importing my new solution into another environment, I did not get the connections setup dialog.

This is because I should have included connection references into the solution to get it to work:


Yeah, but… well, I added my connection reference, and it just did not show up. Have to say PowerApps were a bit uncooperative this afternoon:


It turned out there is a magic trick. Instead of using the “All” filter, make sure it’s set to “Connection References”:


Now we are talking! And now I’m finally getting the connections setup dialog when importing my solution into another environment:


Although, to be fair, maybe I did not even need connection references for CDS (current environment). But, either way, it was worth it, even if only for the sake of the experiment.

PS. As exciting as it is, the sad part about this last screen is that we finally have to say farewell to the classic solution import experience. It does not support this new feature, so, as of now, it’s technically obsolete. You might still want to do some things in the classic solution designer, but make sure you are not using it for import.

For example, here is a Flow that’s using Outlook connector. I just imported a managed solution through the new solution import experience. My flow is on, and there is a correctly configured connection reference in it:


When the same solution is imported using classic experience, the flow is off:


Add intelligent File Download button to your model-driven (or canvas) apps

How do we add a file download button to model-driven apps? And, to start with, why would we even want to do it?

There can be some interesting scenarios there, one being to allow your users to download PowerAutomate-generated word templates (see my previous post).

That, of course, requires some custom development, since you may want to pass the current record id and/or other parameters to the API/url that you’ll be using to download the file. You may also need to use different HTTP methods, you may need to specify different titles for that button, and you may need to have the downloaded file name adjusted somehow.

So, building on my earlier post, here is another PCF control – it’s a generic file download button this time (which we can also use with PowerAutomate):



Unlike the earlier control, this one has a few other perks:

  • First of all, there is a separate solution (to make it easier to try)
  • Also, the download url is represented by 3 parameters this time. This is in case the url is longer than 100 characters (just split it as needed between those 3 parameters) – it seems this is still an issue for PCF components
  • There is an HTTP method parameter (it should normally be “GET” or “POST”; it should be “POST” for PowerAutomate flows)
  • In the model-driven apps, you can use attribute names to sort of parameterize those parameters (just put those names within ## tags). You can also use the “id” parameter, which is just the record id
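Here is a minimal sketch of how that kind of ##attribute## parameterization could be implemented (resolveTemplate is my own hypothetical helper for illustration, not the actual component code; in the real control the attribute values would come from the PCF context):

```javascript
// Hypothetical helper: replaces ##attribute## tokens with values from the
// current record, and the special ##id## token with the record id.
// Unknown tokens are left as-is.
function resolveTemplate(template, recordId, record) {
    return template.replace(/##(\w+)##/g, function (match, name) {
        if (name === "id") return recordId;
        return record[name] !== undefined ? String(record[name]) : match;
    });
}

var fileName = resolveTemplate("Invoice_##ita_name##.docx", "123", { ita_name: "Acme" });
// fileName === "Invoice_Acme.docx"
```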

Here is an example of control settings – notice how file name template is parameterized with the ita_name attribute:


Last but not least, this PCF control can work in two modes: it can download the file, or it can open that file in a new tab. What happens after that depends on whether the file can be viewed in the browser, so, for example, a pdf file will show up in the new tab right away:

You can control the component’s behavior through the highlighted parameter below – use “TRUE” or “FALSE” for the value:

To add this control to your forms, just put some text field on the form, and replace the out-of-the-box control with the ITAFileDownloadButton.

The source code is on github:

And here is a link to the packaged (unmanaged) solution:

Using flow-generated word documents in model-driven apps

Document Templates have been available in model-driven apps for a while now – they are integrated with the model-driven apps, and it’s easy for the users to access them.

They do have limitations, though. We cannot filter those documents, we can only use 1 level of relationships, on each relationship we can only load 100 records max, etc.

There is a “Populate a Microsoft Word Template” action in PowerAutomate, which might be even more powerful, but the problem here is that it’s not quite clear how to turn it into a good user experience. We’d have to let users download those generated documents from the model-driven apps somehow, and, ideally, the whole thing would work like this:


So, while thinking about it, I recalled an old trick we can use to download a file through javascript:

It proved to be quite useful in the scenario above, since, in the end, here is how we can make it all work with a PCF control:


As usual, you will find all source codes on github:

For this particular component, look in the ITAWordTemplate folder.

If using the component as is, you’ll need to configure a few properties. In a nutshell, here is how it works:

  • You will need a flow that is using HTTP request trigger
  • You will need to configure that trigger to accept docId parameter:


  • After that, you can do whatever you need to generate the document, and, eventually, you’ll need to pass that document back through the response action:

Here is a great post that talks about the nuances of working with the “word template” action (apparently, my Flow above is a much simpler version):

  • Then you will need to put ITAWordTemplate component on the form, configure its properties (including Flow url), and that’s about it
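For reference, the request body JSON schema for that trigger only needs to be big enough to accept the docId parameter – a minimal sketch would look like this:

```json
{
  "type": "object",
  "properties": {
    "docId": {
      "type": "string"
    }
  }
}
```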


Technically, most of the work will be happening in these two javascript methods:

public downloadFile(blob: any) {
	if (navigator.msSaveBlob) { // IE 10+
		navigator.msSaveBlob(blob, this._fileName);
	} else {
		var link = document.createElement("a");
		if ( !== undefined) {
			// create a temporary hidden link and "click" it to start the download
			var url = URL.createObjectURL(blob);
			link.setAttribute("href", url);
			link.setAttribute("download", this._fileName);
	 = 'hidden';
			document.body.appendChild(link);
	;
			document.body.removeChild(link);
		}
	}
}

public getFile() {
	var docId: string = this.getUrlParameter("id");
	var data = {
		docId: docId
	};
	fetch(this._flowUrl, {
		method: 'POST',
		headers: {
			'Content-Type': 'application/json'
		},
		body: JSON.stringify(data)
	}).then(response => {
		response.blob().then(blob => {
			this.downloadFile(blob);
		});
	});
}


Just one note on the usage of “fetch” (it has nothing to do with FetchXML, btw). At first, I tried using XMLHttpRequest, but it kept breaking the encoding, so I figured I’d try fetch. And it worked like a charm. Well, it’s the preferred method these days anyway, so there you go – there is no XMLHttpRequest in this code.

One question you may have here is: “what about security?” After all, that’s an http request trigger, so it’s not quite protected. If that’s what you are concerned about, there is another great post you might want to read:


PS. Also, have a look at the follow-up post which is featuring an improved version of this control.

Business rules and editable subgrids

What seems to be the most popular reason why a business rule would not be working?

There is very little that can really break in the business rules, except for one thing: we can include certain fields into the business rule conditions, and, then, forget to add those fields to the context (which can be a form, or it can also be an editable grid).

When working with the forms, we can always make a field hidden, so it won’t be visible, but it will still allow the business rule to work.

When it comes to the editable grids, though, it seems to be just plain dangerous to use the business rules.


  • Editable grids are using views
  • Those views can be updated any time
  • Whoever is updating the views will, sooner or later, forget (or simply won’t know) to add a column for one of the fields required by the business rules


And voila, the business rule will not be working anymore. What’s worse, this kind of bug is not that easy to notice. There will be no errors, no notifications, no signs of a problem at all. Instead, you’ll suddenly realize something is off (and you might not even know why it’s off by that time)… or, maybe, it’s the users who will notice, long after the changes are in production…

This just happened to me again today – there is an editable subgrid for an entity, and that subgrid shows up on two different forms (what’s more, those forms are for different entities). There is an attribute that must be editable on one of the forms, but it should be readonly on the other. The condition in my business rule would have to look at whether there is data in a certain lookup field, and that would only work if I had that lookup field added to the subgrid. Which means the interface would become more crowded, so the users would immediately want to get rid of that column.

Anyway, this is exactly why I removed a couple of business rules from the system just now and replaced them with the following javascript:

function onFeeSelect(executionContext) {
	var gridContext = executionContext.getFormContext();
	var attribute = gridContext.getAttribute("<attribute_name>");
	if (attribute != null)
		attribute.controls.forEach(function (c) { c.setDisabled(true); });
}


That script is now attached to the onRecordSelect subgrid event only on the forms I need.

And this should do it – no more users will be updating that attribute in the editable subgrid on that particular form.