Monthly Archives: November 2021

Using Power Automate to generate a document

This post should have been written before the one where I shared a sample script that calls a flow to download a document, but, since these two posts happened to come out in reverse order, it seems this is, technically, a prequel. Hm… If you have not seen the previous post, maybe read this one first :)

Anyways, there was a question about the actual flow, so I wanted to explain what’s happening in the flow. Here we go:

1. There is an HTTP trigger followed by a Compose action to extract the record id from the query parameters

[screenshot]

Why am I using the “GET” method? Well, it’s just so I can test the flow directly from the browser address bar.
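
By the way, since the script from the other post passes the record id in the “id” query parameter, the Compose input would be an expression along these lines (this is the standard way to read query parameters from an HTTP trigger):

triggerOutputs()['queries']['id']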

2. From there, I should have added a few Dataverse actions to query data for the Word Template, but I cheated. Instead, the data is “hardcoded”:

[screenshot]

3. In the next action, the flow will initialize the WordDoc “object” variable

[screenshot]

Why do I need that? Basically, it’s because the flow can generate the document in two different places. So, in both places, I’ll set the variable, and I’ll use it later in the single Response action.
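
Schematically, the pattern looks like this (just a sketch, the exact action setup is in the screenshots below):

Initialize variable: Name = WordDoc, Type = Object
…
Set variable (in each document generation branch): WordDoc = <file content produced by that branch>
…
Response: Body = WordDoc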

4. Just to make it more confusing, I have a condition in the flow where the flow will either use a Power BI Paginated Report or a Word Template to produce the Word document

[screenshot]

[screenshot]

Here is the remaining piece of that scope action above:

[screenshot]

Why is there such logic? It’s a demo flow, so the purpose is to demonstrate two ways of generating the document (through Power BI Paginated Reports and through Word Templates), and, then, to show that it does not change the end result.
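
Just as a hypothetical example (the actual condition is in the screenshots above), the branch could be driven by yet another query parameter (say, “engine”, which is a name I’m making up here):

equals(coalesce(triggerOutputs()?['queries']?['engine'], 'word'), 'pbi')

When that expression evaluates to true, the flow would take the Power BI Paginated Report branch; otherwise, it would use the Word Template.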

Although, realistically, there are differences.

The Power BI action will take at least 30 seconds to execute. The Word Template action will work much faster. That said, Power BI Paginated Reports (which are, basically, SSRS… in case you are more familiar with that) can help you build pixel-perfect reports (unlike Word Templates, where precise formatting can be more complicated).

But that does not change anything for the client – whatever the flow decides to use for document generation, as long as it has that HTTP trigger, and as long as it sends a Word document back, the client does not care.

5. And here is how the response is sent back

[screenshot]
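
Roughly, that Response action comes down to three settings; the important parts are the content type header and the body (and, depending on what exactly ends up in the variable, the body might need an expression such as base64ToBinary(variables('WordDoc')?['$content']) instead):

Status code: 200
Headers: Content-Type = application/vnd.openxmlformats-officedocument.wordprocessingml.document
Body: variables('WordDoc')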

6. The same flow would also send an email with a file attachment

[screenshot]

You might ask what that Compose action is doing there – why not use the “WordDoc” variable directly? It’s because “Attachments Content” seems to expect something other than an object, and, in those cases, I find that a variable might not be compatible with the action parameter type, but Compose usually is (since it’s sort of “generic”). So, the variable is used as the Compose input, and, then, the Compose output is used in the email action. Think of it as type conversion, even if a slightly weird one.
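
For reference, if you were to switch the attachments parameter to its “raw” array input instead, the equivalent value would look roughly like this (“Name” and “ContentBytes” are the two properties the email action expects):

[
  {
    "Name": "Summary.docx",
    "ContentBytes": <output of that Compose action>
  }
]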

Hope this helps.

Record ownership across business units (aka “matrix data access structure”) – it’s here now

I’ve been diligently checking for this setting in my admin portal a few times per week lately, and, finally, it has arrived:

[screenshot]

Before you proceed, you might want to read my earlier intro post for this feature:

https://www.itaintboring.com/dynamics-crm/matrix-business-units-the-cool-new-way-of-setting-up-user-security-in-microsoft-dataverse/

In this post, I’m just going to show how this works.

First of all, don’t be surprised to see the progress indicator spinning there for a while once you have enabled that feature in the admin portal – just give it some time to complete.

Once this is enabled, I can go to the admin portal and start assigning security roles to users in specific business units. As in, I can choose a business unit, and, then, I can assign roles:

[screenshot]

Compare the screenshot above to the one below, which is from the environment where this new feature has not been enabled yet, and you’ll see the difference:

[screenshot]

In those environments, I can’t pick a business unit when assigning a role to my user (essentially, roles will always be assigned in the user’s own BU).

What is it giving us so far, then?

Do you have users from one BU who need access to certain tables in another BU? Create a security role, configure permissions, then just assign that role to your users directly in the business unit, and they’ll get access to those tables.

You don’t need to worry about creating access teams. You don’t need to implement record sharing.  You can just start permitting users to access data in other BU-s by giving them roles in those BU-s. Essentially, we can, now, create “guest users” in various business units.

What it does not solve, yet, is the ownership issue. By default, when you have a user-owned table, a record created in that table will be assigned to the creator, and its “owning business unit” column will be set to the business unit of the owner. Which is not always what we might want to happen to records created by those “guest” users – instead, we might want the business unit to keep owning those records.

This is because we might still want regular business unit users (those who belong to the BU) to have access to such records, and, from that standpoint, nothing has really changed – there are, still, the same access levels:

  • Owner
  • Business Unit
  • Parent-Child Business Units
  • Organization

[screenshot]

This is where the second part of this new feature comes in, since we can add the “Owning Business Unit” column to the form (or, in general, we can manipulate that column).

Note: as of Nov 22, at least in the Canada region, when a new form is added to the environment, the “owning business unit” column might not get exposed. This will be fixed, but, for now, if you don’t see that column in the form designer, just use a pre-existing table/form to keep experimenting with the new security model.

Let’s add it to the form, then:

[screenshot]

Now, when creating a record in the model-driven app, I’ll be able to pick the BU for my new record:

[screenshot]

Or, if I leave it empty, it’ll be set to the BU of the owner by default:

[screenshot]

That’s, basically, how it works, but, from there, we can get into all the different scenarios. For example, let’s say I wanted to assign that record to another user.

I might or might not be able to do it depending on whether that other user has permissions in the target BU.

For instance, in the scenario below, I have that record assigned to myself, it’s in the “Test” BU, and I’m trying to assign it to another user:

[screenshot]

Apparently, my other user does not have permissions in that BU. Which is fine – I could give that user the required security roles in the target BU, and that would work even if the user belongs to another BU.

Here, let’s give that user a role in the Test BU:

[screenshot]

And let’s confirm that the user belongs to the main BU:

[screenshot]

The security role only gives access at the Business Unit level, so, normally, this should not work (since the record is in a different BU):

[screenshot]

But, with this new model, since the user has that role assigned to them directly in the BU, it should all be good now. Let’s try:

[screenshot]

Oops… is that an error again?

The problem now is that, by default, records are automatically moved to the owner’s business unit. In order to change that behavior, we need to update organization settings and set “AlwaysMoveRecordToOwnerBusinessUnit” to false (it’s an org-level setting; if I’m not mistaken, the OrgDBOrgSettings editor tool can be used for this).

[screenshot]

Those cascading behaviors are described in more detail here:

https://docs.microsoft.com/en-us/powerapps/developer/data-platform/configure-entity-relationship-cascading-behavior#about-the-assign-action

With that done, I can finally save my record so it’s assigned to a user, and, yet, it still belongs to a different business unit:

[screenshot]

Do we want to always change that setting so that records are not moved to the owner’s BU? Guess it depends, but you’ll need to keep this in mind when designing your security model. After all, that setting is “all or nothing” – it can’t be done per business unit/per user. So, an alternative might be to set it to “false” for the org, and, yet, to use a flow/plugin to maintain default/legacy behavior for some of the business units, for instance.

Well, I’m pretty sure there are other edge cases and caveats we may need to think about when working with this new model, but I’m hoping this was useful so far to get you started.

Would also recommend this video by Paul Liew: https://www.youtube.com/watch?v=NBBYinF9B7g&feature=youtu.be

Or yet another one by Scott Durow: https://www.youtube.com/watch?v=dVGklfmVr6s

Have fun!

How to: add a ribbon button that calls a Power Automate flow (and downloads generated document as a result)

If you ever wanted to add a ribbon button that would be calling a flow (one usage scenario for this is to start using Power BI Paginated Reports and/or Power Automate Word Templates to generate documents/reports), there is a sample script below which you could use to achieve just that.

Here is how the user experience would look with this script in place:

[animation]

Here is what’s happening in the script:

  • The script will start by querying the value of an environment variable (which is called ita_documentgenerationflow – you’ll need to change the name of that variable to match your environment). That variable stores the HTTP trigger URL of the flow to be executed
  • The purpose of this variable is to support the DEV->TEST->PROD process, since, once the variable is set properly in each of those environments, the script will use it to execute the right flow
  • Once the script has that URL, it will add the record id as a query parameter, and it will use the “fetch” function to call the flow
  • In the success callback, it will, then, download the generated document

Anyways, here is the script:

function callHTTPFlow(primaryControl) {
	"use strict";
	debugger;
	Xrm.Utility.showProgressIndicator("Generating document...");
	getEnvironmentVariable("ita_documentgenerationflow", 
	    function(url){
			callHTTPFlowInternal(url, primaryControl);
	    },
		function(error){
			handleError(error);
		}
	);
}

function handleError(error)
{
	Xrm.Utility.closeProgressIndicator();
	showMessageDialog("Error", error);
}

function callHTTPFlowInternal(url, primaryControl) {
	"use strict";
	var fileName = "Summary.docx";
	var documentId = primaryControl.data.entity.getId();
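	// save any pending changes first, so the flow works with up-to-date data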
	primaryControl.data.save().then(function() {
		
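		// remove the curly braces from the record id and append it to the trigger URL as a query parameter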
		documentId = documentId.replace("{", "").replace("}", "");
		url = url + "&id=" + documentId;

		fetch(url, {
			method: 'GET',
			headers: {
				'Content-Type': 'application/json'
			},
			body: null // no request body is needed for a GET request
		}).then(response => {
			Xrm.Utility.closeProgressIndicator();
			if (response.status == 200) {
				response.blob().then(blob => {
					downloadFile(blob, fileName);
					closePopups(primaryControl);
				});
			} else {
				response.text().then(body => {
					handleError(body);
				});
			}
		}).then(data => console.log(data))
		.catch(function(error) {
			handleError(error);
		});
	},
	function(error) {
		Xrm.Utility.closeProgressIndicator();
		showMessageDialog("Error", error.message);
	});
}

function closePopups(formContext) {
	formContext.data.refresh(false); 
	Xrm.Utility.closeProgressIndicator();
}
	
function downloadFile(blob, fileName) {
	if (navigator.msSaveBlob) { // IE 10+
		navigator.msSaveBlob(blob, fileName);
	} else {
		var link = document.createElement("a");
		if (link.download !== undefined) {
			var url = URL.createObjectURL(blob);
			link.setAttribute("href", url);
			link.setAttribute("download", fileName);
			link.style.visibility = 'hidden';
			document.body.appendChild(link);
			link.click();
			document.body.removeChild(link);
		}
	}
}

function showMessageDialog(messageTitle, message) {
	var alertStrings = {
		confirmButtonLabel: "OK",
		text: message,
		title: messageTitle
	};
	var alertOptions = {
		height: 120,
		width: 260
	};
	Xrm.Navigation.openAlertDialog(alertStrings, alertOptions);
}


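// queries the environment variable definition: returns the current value if one is set, or the default value otherwise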
function getEnvironmentVariable(varName, onSuccess, onError){
  "use strict";
   Xrm.WebApi.retrieveMultipleRecords("environmentvariabledefinition", "?$select=defaultvalue,displayname&$expand=environmentvariabledefinition_environmentvariablevalue($select=value)&$filter=schemaname eq '"+varName+"'").then(
		function success(result) {
			var varValue = null;
			for (var i = 0; i < result.entities.length; i++) {
			    
				if(typeof(result.entities[i]["environmentvariabledefinition_environmentvariablevalue"]) != "undefined"
				   && result.entities[i]["environmentvariabledefinition_environmentvariablevalue"].length > 0)
				{
				   varValue = result.entities[i]["environmentvariabledefinition_environmentvariablevalue"][0].value;
				}
				else if(typeof(result.entities[i].defaultvalue) != "undefined")
				{
				   varValue = result.entities[i].defaultvalue;
				}
				else{
				   varValue = null;
				}
			}    
			onSuccess(varValue);	
		},
		function (error) {
			console.log(error.message);
			onError(error);			
		}
	);
}

And here is how the button is configured in the Ribbon Workbench:

[screenshot]

And here is one more screenshot just for completeness – this is how you could configure the Response action of your flow to work with the script above:

[screenshot]

Nov 23: if you wanted to see a more detailed example of the flow, have a look at this post

Another Date column, and that’s yet another kaboom

It happens every now and then that I have to re-create a column across multiple environments, all because of a data type change. Sometimes, there are good reasons for that, and, sometimes, all I can do is kick myself for not thinking it through from the beginning; since, of course, changing the data type of a field that’s being used in various plugins/flows/reports is no small feat.

And another thing I can do in such cases is write up a blog post – it’s sort of a “lessons learned” one.

If you prefer visuals, let’s start with this quick diagram – there is a nice kaboom right at the bottom:

[screenshot]

You see, I actually wrote about date formats in Dataverse before:

https://www.itaintboring.com/dynamics-crm/user-local-behavior-with-date-only-format-whats-that-for/

Wait, I did it at least twice:

https://www.itaintboring.com/dynamics-crm/dates-in-dynamics-timezone-independent-vs-user-local/

And, yet, here we go – there is another kaboom.

Here is the problem with timezone-independent date-only columns:

  • If you had such a column
  • And if you set a value there from your plugin
  • You’d usually use something like DateTime.UtcNow to set the date in C# (since you are supposed to be using UTC dates)
  • However, timezone-independent columns don’t care about whether the date is in UTC or not – it’s just the “date” part that matters
  • And, of course, if you are in EST and it’s 8 PM, DateTime.UtcNow is already 1 AM of the next day, so you’ll get “tomorrow” as your date. When you were probably expecting to see “today”

The problem is, this is all happening in the plugin, so you could, of course, use DateTime.Now instead, but that might be in the wrong timezone, too (the plugin is running on the server, after all).

And this is all while timezone-independent behavior works nicely when the column value is set directly on the client, since, on the client side, users will be setting the date consciously, so they’ll see exactly what date they are using (unlike in the plugin, where the date portion of UtcNow can be different from the user’s date because of time zone adjustments).

So, what should I do?

In my case, “User Local” behavior might actually work better. The date is set automatically, so the only date I can rely on in the plugin is UtcNow, and, if I used “user local” behavior, that date would be adjusted to the user’s timezone when it’s displayed to the user. Which, in this case, is exactly what I need.

Of course I’d need to re-create that field now, which is going to be a pain in the… yep… but, well, off I go to work on that.

Pay-as-you-go VS consumption-based billing

I like how Microsoft keeps introducing changes to the Power Apps licensing – it’s good to see prices going down, API limits going up, and new licensing options being introduced in addition to the existing ones.

So, what’s the story with that new “pay as you go” plan?

https://powerapps.microsoft.com/en-us/pricing/

[screenshot]

It’s an interesting addition to the plans, but the key thing to understand is that it’s not, really, “resource consumption” billing. As in, when you are using other Azure services, you are billed for CPU/memory/requests/etc. With the “pay as you go” Power Apps plan, you are billed when a user opens an app, and you are billed as if that user were using the app for the whole month:

[screenshot]

(Have a look at https://docs.microsoft.com/en-us/power-platform/admin/pay-as-you-go-meters?tabs=image for more details)

In other words, the way I understand it, this might work for users who would be using a single app every second or third month. Because, at $10 per active user per app per month (if I’m reading the pricing page correctly), if those users were going to keep using the app monthly, it would be better to give them an app pass ($5 per licence).

Or, if such users were going to use at least 2 apps monthly, it would be better to give them a user licence ($20).

If it were hourly or daily billing, I guess it would have been different, but, since the billing is monthly, the benefits are somewhat limited, it seems. You really do need a usage pattern where your users are not going to be coming back to the environment to use that single app every month, and I would not be too sure how to predict that, given that it’s exactly that kind of unpredictability which “pay as you go” plans are supposed to address.

And this is where, if Microsoft were to introduce a real consumption model, it would be much more useful, it seems. As in, what if we could pay for CPU usage, storage, and traffic? I’d think, since Power Platform is built on top of Azure, there should be a way… There would be no need for API limits, no need for monthly billing, etc. That is probably the future anyway, but, right now, we have a new plan available, so let’s see if and when it’s going to be beneficial to the clients.

Increased request limits and allocations: 25K -> 40K, and 100K -> 500K

Well, good things just keep coming in. API limits have been increased quite a bit, and, it seems, it just happened yesterday:

https://docs.microsoft.com/en-us/power-platform/admin/api-request-limits-allocations

It’s, now, 40,000 calls per Power Apps / D365 user:

[screenshot]

And, for the non-interactive/application users, it’s 500,000 per day, pooled:

[screenshot]

When did this happen? Looking at the git sources, it seems this happened yesterday (Nov 1). Just in time for Microsoft Ignite, I guess :)

[screenshot]

Configuring the maintenance window

Now that we have the ability to configure maintenance window settings in the production environments, what would we use it for?

[screenshot]

The wording there is quite interesting – if there is no downtime or service degradation, why would we care?

Well, the answer seems to be two-fold.

On the one hand, from the operations perspective, there should be no service degradation. As in, your users are supposed to be able to keep working in the environment without being impacted by whatever maintenance activities might be happening in the background.

On the other hand, from the development and deployment perspective, have you ever tried publishing multiple solutions at the same time in the same environment? You might have seen the error below, then:

[screenshot]

So, in theory, by having that maintenance window configured, you can, at least, guarantee that your own deployments won’t be happening at the same time as Microsoft is doing theirs, and, in that sense, your deployments won’t be interrupted by Microsoft first-party solutions being deployed at the same time.

You could, also, ensure that your business users don’t run into changes in the middle of the day if your organization operates in different time zones. In that case, you could configure every production environment with its own maintenance window (you can’t do it for sandboxes). Although, unless you are doing automated testing right at the end of the maintenance window, the users would still run into those changes the following morning. Does it matter if that happens in the morning or during the day? Maybe.

Not sure what other use cases would be available right away, but, I guess, more are coming in the future.

You can read more on this new setting in the docs:

https://docs.microsoft.com/en-us/power-platform/admin/manage-maintenance-window

Matrix Business Units – cool new way of setting up user security in Microsoft Dataverse

Have you noticed that there is a new feature that’s been added to the security model in Dataverse, and it’s called “Matrix Business Units”?

https://docs.microsoft.com/en-us/power-platform/admin/wp-security-cds

[screenshot]

It is in preview, and it is being rolled out. To be honest, it’s not, yet, available in the environments I have access to, so I’m just waiting to see it there, but, based on the private preview experience, this is going to be of great help in many cases.

Because, you see, once it is available and enabled, we’ll be able to achieve the following:

  • Keep our users in any BU
  • While giving them access to other BU-s by assigning roles in those BU-s
  • Assign records to the users, while keeping the same records associated to just about any BU in the system

For example, imagine two different business units with different groups of users. You might want each group to see records associated with their respective BU, but you might still want to allow a few users from another BU to have access, too.

Here is what might happen then:

  • If a user from another BU were to create a record, that record would be associated with the creator’s business unit (since, by default, the creator becomes the owner)
  • Which is still going to be the case, but you will now be able to change the BU that the new record belongs to while still keeping it with the original owner

That would allow the user who created the record to have full access to it, while also allowing all users from the BU with which that record is associated to have access to it through their security roles.

That, however, does not change the basics of owner-based / BU security. If a record belongs to a business unit, anyone who needs access to that record should have at least some role in that BU that gives them access to the record (even the owner of the record).

The whole point, though, is that now we’ll be able to give roles in the BU-s without having to add users to teams, and, also, we’ll be able to decouple “owners” from “owning business units”.

Anyways, I’d be happy to share screenshots, but, given that private preview might have been somewhat different from the public preview, I’ll hold off and wait till this has been rolled out.

So, then, this post is *TO BE CONTINUED*

Nov 22: Here is part 2 of this post