
Azure Architecture and Power Platform

I’ve been trying to catch up on Azure architecture lately using the free learning material that Microsoft provides for the related AZ-300 exam:

There is a lot to catch up on, since it’s definitely not my primary area of expertise, but now that I’m through about a quarter of that course, I can’t help but start thinking about how that relates to the Power Platform/Dynamics.

Quite frankly, it seems that, even if the concepts discussed there are still applicable, technically Power Platform is very independent from Azure. It might be running on the Azure backbone, but, from the end-user and/or administrator standpoint, there is not a lot of control over how exactly it’s running there. Which is good and bad, as usual.

On the one hand, it’s up to Microsoft to ensure that Power Platform is running smoothly, so we, Power Platform users/admins, don’t need to worry about it.

On the other hand, Power Platform architecture essentially denies access to some of the Azure concepts. For example:

  • Power Platform environments are tied to regions. If there is any fault tolerance embedded there, it’s not exactly clear how it works
  • There is no load balancing, health probing, or traffic management – or, more exactly, they are not within our control. I’m guessing traffic management might still be possible, but it would not make a lot of sense, since we can’t do CDS database replication between regions. Besides, there would be licensing implications
  • With the SLAs, it’s not clear what is really guaranteed


Actually, when it comes to the SLAs, it’s very interesting in general. I used to think an SLA was sort of an uptime guarantee, and this is how it is described in the architecture courses. But, come to think of it, it’s more of a “money-back” guarantee. For a lot of Microsoft products, you will find the corresponding SLAs in this document:

As far as Power Apps go, here is what it looks like:


Strictly speaking, in terms of service availability there is just no guarantee. It’s simply common sense that Microsoft would want to hold on to the subscription payments rather than reimburse its clients for service degradation. Although, that reimbursement would be limited either way.

In other words, there is an SLA, but, getting back to the architecture in general… let’s say we are building an application that is going to utilize CDS Web APIs, and we want to guarantee 99.9% uptime for that app. We can keep adding load balancers, availability sets, etc. But we can’t do better than the system we depend on, which is CDS in this case. The problem is, Power Platform subscription costs might not be that big of a component in the overall cost of our application’s downtime.
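To put a number on that dependency argument: the availability of a chain of serial dependencies can’t exceed the product of the individual availabilities. Here is a quick sketch – the 99.95%/99.9% figures are just illustrative assumptions, not actual SLA numbers:

```python
def composite_availability(*availabilities: float) -> float:
    """Availability of serially dependent components is the product of the parts."""
    result = 1.0
    for a in availabilities:
        result *= a
    return result

# A hypothetical app front end at 99.95% sitting on top of CDS at 99.9%:
overall = composite_availability(0.9995, 0.999)
print(f"{overall:.2%}")  # about 99.85% - already below the 99.9% target
```

So, no amount of load balancing on our side can push the combined number back above the weakest dependency.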

This has actually been my main “disagreement” with the whole ADX Studio architecture from the early days, and I am still not that convinced Power Apps Portals are much better in that sense. Although, I have to admit Power Apps Portals are running in Azure and are managed by Microsoft, and Microsoft likely has more tools and experience to maintain and operate them compared to the majority of individual clients who used to install ADX on-premises.

Either way, even though a bunch of things are out of our control in the Power Platform world, there is still quite a bit that’s on us:

  • Backups and disaster recovery. Technically, backups are supposed to be included in the disaster recovery plans… however, in the case of Power Platform, it’s not quite clear whether we can have any disaster recovery plan other than putting our trust in Microsoft and hoping there is a plan. There are database backups, though, so we can use those to restore our CDS databases if, somehow, the data gets broken there. On the other hand, Power Platform is not tied exclusively to CDS – there can be other data sources involved, and backup procedures for those other data sources can be quite different
  • Did you know you can use ExpressRoute to connect your network to the Microsoft Cloud? This is how you can get some extra security and lower latency, although, of course, it’s not a free service. Still, it might speed up (and secure) access to the Microsoft cloud in general, and to the Power Platform in particular, for your internal users
  • Data security in CDS. That’s never been particularly simple, but, with the introduction of canvas apps, Excel Online data editing, Power BI, etc., it’s probably easier than ever to miss something in the security configuration and unintentionally expose data. Data security deserves a separate post, though

Well, this has not been a very coherent post – it’s probably just a reflection on what I’ve been reading about lately. But there is one good topic to explore further, which is security, and that is likely what I’ll get back to in one of the following posts.

“Default” property in the Canvas Apps controls – there is more to it than the name suggests

This comes straight from the Power Apps documentation:

Default – The initial value of a control before it is changed by the user

Actually, I never bothered to read that definition until recently, and, it seems, there is some discrepancy there.

That definition seems to imply that “Default” will only affect your control’s initial value, but it’s not the case when “Default” is sourced from a variable. Instead, every time the variable is updated, the value you see displayed in the control will be updated as well, even if the user has already changed that value by typing something different into the control.

Here is an example:


What’s happening there is:

1. I have a text box which updates a variable in its OnChange


2. And I have another text box which sources its “Default” property from the global variable above


3. Every time the variable is updated through my first text box, my second text box picks up that updated value – even after I have typed something different into that text box

Either way, that is a very useful behavior. Otherwise, how would I even “reset” my controls if I wanted them to reflect those updated variable values? But it’s definitely more than just “the initial value”.
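To make the setup above concrete, here is a minimal Power Fx sketch (the control and variable names are assumed for illustration, not the ones from my actual app):

```
// OnChange of the first text box (TextInput1 - assumed name):
Set(varDefault, TextInput1.Text)

// "Default" property of the second text box:
varDefault
```

Each Set call changes varDefault, and, since the second text box’s Default points at that variable, the control gets reset to the new value.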

PS. A few hours after writing this blog post, proof has been found that this is “by design” behavior :)

“Input controls are also reset when their Default property changes”


Working with custom connectors – a few observations

For some reason, I got hooked on custom connectors for the time being. It’s a bit of a learning curve for somebody who has not been doing a lot of development outside of plugins/scripts/occasional .NET for a while, so, if nothing else, it’s a good exercise in rebuilding at least some of those development skills.

Interestingly, the learning here is not focused on the development only. Custom connectors are closely tied to Power Platform, and, besides, my Web API has to be hosted somewhere. So this involves building a Web API, figuring out how to host it in Azure (in my case), and setting up a connector in Power Platform.

Hence, in no particular order, here are some of the observations so far.

1. Creating a Web API in Visual Studio is very straightforward


Once you have a project, you may want to remove everything other than the Post method, and you may also want to update the route:


Then you just need to publish it somewhere, and, of course, publishing to Azure is easy:


You may want to look at the more detailed tutorial here, though:

2. Creating a Swagger file (or OpenAPI file) is more involved

That file is, really, just a description of your API. While creating a custom connector in Power Automate/Power Apps, you can feed that file to the custom connector “wizard”; it will parse the file, and you won’t have to do a thing manually at that point.

But, of course, you may actually want to create that file AFTER you have an API. Or you may even want to generate it automatically.

This is where a couple of tools might help.

a) The post below provides instructions on how to generate Swagger files for your Web API projects

However, once the file was generated, I still could not use it to create a custom connector, since some of the information was missing from the file.

b) The Swagger editor might help at that point

I added a few tags to my file (“host”, “basePath”, “schemes”, “consumes”, “produces”). I’m not sure all of them would be required, but I’m pretty sure Power Platform expects at least the “host” information to be there (since that’s where I was getting an error).
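For reference, here is roughly what those tags look like at the top of a Swagger 2.0 file (the host value below is a placeholder, not my actual API):

```json
{
  "swagger": "2.0",
  "info": { "title": "Regex Web API", "version": "1.0" },
  "host": "yourapiname.azurewebsites.net",
  "basePath": "/",
  "schemes": [ "https" ],
  "consumes": [ "application/json" ],
  "produces": [ "application/json" ]
}
```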

3. Enabling authentication for your Web API (in Azure)

This turned out to be a more complicated story for some reason, and I’m still trying to figure it out. The Web API would be hosted as an app service, and it was not that complicated to enable authentication there. What has proven to be more challenging is setting it up so that users from other Azure tenants could use my Web API.

First of all, that requires a custom domain. And, if there is a custom domain, it needs an SSL certificate. And, if there is an SSL certificate, I need a more expensive app service hosting plan. But, even once I had done all of that, I was still getting an error when trying to utilize my Web API with an account from another tenant, since, somehow, I was still required to add that user as a guest first. Anyway, that’s the bulk of it, and, it seems, I’ll need to get back to the authentication part.

For now, there is no authentication on my Web API.

4. It’s the second time I’m observing errors in one browser while another is working fine

It happened with the UI Flows before:

And it also happened this time, when I was trying to update my custom connector. It turned out there is a related recent community thread, so, it seems, it’s just my luck that I started working with custom connectors at about the same time this problem was reported:

Anyway, in my case switching browsers has helped in both situations.

5. While in the “test” mode, custom connectors don’t seem to recognize arrays

There is an array parameter in my connector. It works fine when using the “raw body” option to adjust the JSON:


However, once in the “regular” mode, there seems to be no way to turn that parameter into an array – it would only accept one element no matter what:


Still, when using that connector in the actual Flow, I can set up an array:


And I can pass that array through the action parameter:


Either way, the Web API source code is on GitHub so far:

There is a related Swagger file you can use to create a custom connector in Power Platform:

The API is hosted in Azure on a shared plan – you can try it, but don’t expect much in terms of uptime/reliability:

Both methods will expect a POST request.

Regex is, well, regex. More details here:
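For what it’s worth, the matching logic itself boils down to something like this – a Python sketch of the idea, not the actual .NET endpoint, whose parameter names may differ:

```python
import re

def regex_match(value: str, pattern: str) -> bool:
    """Return True if the whole input value matches the regex pattern."""
    return re.fullmatch(pattern, value) is not None

# For example, validating a Canadian postal code format:
print(regex_match("K1A 0B1", r"[A-Za-z]\d[A-Za-z] \d[A-Za-z]\d"))  # True
```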

The other one (addbusinessdays) takes the starting date, an array of holidays (see the screenshots above), and the number of business days to add to the starting date. It will, then, add those days to the starting date, accounting for Saturdays, Sundays, and all the holidays on the list.
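The addbusinessdays algorithm described above can be sketched like this (a Python illustration of the logic – the actual Web API on GitHub is implemented in .NET):

```python
from datetime import date, timedelta

def add_business_days(start: date, days_to_add: int, holidays: list) -> date:
    """Add N business days to a date, skipping Saturdays, Sundays, and holidays."""
    current = start
    remaining = days_to_add
    while remaining > 0:
        current += timedelta(days=1)
        # weekday() is 5 for Saturday and 6 for Sunday
        if current.weekday() >= 5 or current in holidays:
            continue
        remaining -= 1
    return current

# Friday Dec 20, 2019 + 3 business days, with Dec 25 as a holiday:
print(add_business_days(date(2019, 12, 20), 3, [date(2019, 12, 25)]))  # 2019-12-26
```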

PowerPlatform: beyond the custom code

As far as adding custom code to PowerApps/PowerAutomate goes, I’ve looked at three different options so far:


Those are all valid options, but there are things to consider which go beyond purely technical aspects. For example, even though some Office 365 licenses include Power Apps use rights, those licenses will not allow access to the custom connectors.

So, how do we compare those three options?

It might be worth looking at the following 7 aspects – there is probably more to compare, but this should be a good starting point:


Let’s look at each of those boxes one after another.

1. Data Security

For this one, I mostly wanted to look at it from the perspective of CDS security roles.

When setting up an Azure Function that would be connecting to CDS, we would likely utilize an account (be it a user account or an application account), and that account might be different from the account of the user who is running the Flow/utilizing the Power App. Which means there might be quite a few security issues there, since those two accounts might have different levels of data access.

CDS Custom Actions, on the other hand, would be utilizing the user account specified in the Flow connection, and that would likely be the user account of the Flow creator. That would be somewhat more consistent. Moreover, absolutely no effort would be required from the CDS custom action developer to achieve this.

As far as custom connectors go, it seems I don’t have enough experience there to be sure. On the one hand, we can set up authentication for the custom connectors. On the other hand, I am not sure if/how we can reuse those connections from within the custom connector code to open subsequent connections to CDS from code.

As far as data security goes, at least in relation to CDS, it seems CDS Custom Actions would have a bit of an edge.

2. Data Loss Prevention

Azure Functions, unless they are wrapped up into custom connectors, will work over the out-of-the-box HTTP connector.

CDS Custom Actions will work over the CDS connector.

From the DLP perspective, there is no way to separate one Azure Function from another or one CDS Custom Action from another.

Custom Connectors, on the other hand, can be added to the DLP individually:

We can wrap up different API-s into different custom connectors, and, depending on the needs, we can add those connectors to the DLP as required.

From this standpoint, Custom Connectors look better.

3. Code Hosting

Azure Functions are hosted in Azure. That kind of “hosting” is easy to set up, but it’s somewhat limited and is probably not meant for creating really complex APIs.

CDS Custom Actions are hosted in CDS. Which means you need CDS to start with. Which also means custom actions are tied to the CDS environment. That allows for the DEV-TEST-UAT-PROD scenario, but, on the other hand, it might not be the best option when you need to host some kind of shared API. Also, just like Azure Functions, CDS custom actions are not really meant to serve as an advanced API engine.

Custom Connectors are hosted… technically, it’s the API behind the connector which is hosted. It can be hosted in Azure, or it can be hosted on some other servers. The advantage is that you can go as complex as you want with those APIs, but it’s also a disadvantage, since you have to figure out deployment, lifecycle, etc.

The way I see it, there is no clear winner in this category. CDS custom actions work really well when your API is supposed to be tied to a specific CDS environment. Azure Functions work great in the non-CDS scenario where you don’t need complicated code. And with custom connectors you can build something really advanced, but that comes with additional deployment and configuration complexity.

4. CDS Solution Awareness

What if you wanted to move your custom code from one CDS environment to another? Of course, the question itself assumes that such code would be environment-specific somehow. It might be because it is supposed to work with that particular CDS environment, or it might be because it has to mirror the same release process (Dev-Test-Prod).

Azure Functions have no idea of what CDS solutions are, so they are not competing in this category at all.

CDS Custom Actions live in CDS, they can be added to the solutions, so it’s their natural environment.

Custom Connectors can be added to the solutions, though I am wondering what it really means. You can add the connector, but you can’t add the API, so what exactly are you achieving by doing that?

Either way, in terms of CDS solution awareness and in terms of our ability to mimic CDS solution deployment process for custom code, CDS custom actions will definitely be ahead of the other two. They do take this one.

5. CDS Integration

This one is likely going to the CDS Custom Actions, too. Even if only because CDS is right there, in the name.

But, seriously, when it comes to CDS custom actions, we can write plugins and we don’t have to worry about authentication and/or about utilizing web api etc. All those SDK assemblies are there, so we can build code easily.

This is not the same for Azure Functions and Custom Connectors, even though we can always add references to the same SDK assemblies and set up the connections from code. But, then, those connections may have to account for different connection strings depending on whether we are working with Dev/Test/Prod, and how do we do that properly… that’s not a problem for the plugins at all – they just don’t need to worry about it.

6. Licensing

It’s better not to talk about licensing, but it’s also one of those topics which is just unavoidable.

First of all, whether it’s an HTTP connector (for Azure Functions), a Custom Connector, or a CDS connector, those are all premium connectors. Which means you do need an appropriate license to use them in Power Apps/Power Automate.

Other than that…

Azure Functions: there are tiers, but, essentially, it’s “pay per use”. Although, there is a caveat. If an Azure Function is not connecting to CDS, then that’s what it is. If it is connecting to CDS, then we also need a license and/or an API usage add-on for CDS. Besides, since we will be using an HTTP connector, the premium-connector licensing requirement applies either way.

Custom Connectors: depending on where they are hosted, additional licenses/fees might be involved.

CDS Custom Actions: even if you are using them for something like “regex” validations, each custom action call is still considered a CDS API call, and there are limits on how many calls are allowed per license/add-on.

Is there a winner? I think Custom Connectors offer more flexibility, so I will give it to them.

7. Other

Custom Connectors can be shared, and they can also be certified and made available to users in other tenants. For Azure Functions, the best we can do is share the code. For CDS Custom Actions, we can package them as solutions and share them with other CDS customers.

Logic Apps do not support the CDS (Current Environment) connector, so using CDS custom actions from Logic Apps might be more involved than using them from Power Automate Flows.

From the usability standpoint, Custom Connectors are, likely, the easiest to consume in the Flows/PowerApps. Azure Functions require json parsing etc. CDS Custom Actions – they seem to be somewhere in between.


I don’t think there is a clear winner for all situations. I would not even say CDS Custom Actions work best whenever we are talking about CDS environments. Even more, I would not be surprised if I have missed something above that would turn everything on its head. But, like I said, this might be a good starting point.

Have fun!

Here is a riddle: “I am a readonly record, but I am still updatable in the user interface. What am I?”

Have you ever noticed there is at least one entity in the Model-Driven apps (and in Dynamics before) which would sometimes claim a form is read-only, but which would still be somewhat updatable in the user interface?

Even more, this peculiar behavior may not be easily controlled through JavaScript.

See, you can update the “Regarding” field on completed emails, even though the form will be telling you that the record is read-only:


What will happen as a result is that you’ll see the “unsaved changes” notification in the status bar:


Even though you won’t see the usual “save” button there.

However, eventually “Autosave” will kick in, and the updated “regarding” will be saved. Or you could use CTRL+S to save the changes right away.

That seems to be a bit of a user-interface inconsistency, but there is a good reason why “regarding” is not made read-only (even if the implementation feels more like a workaround). When an email comes in, and it does not get linked to the right record, you may still want to change “regarding” on such an email, even though it’s already been marked as “received” (or, possibly, as “sent”).

One might argue that it’s no different from how other entities work, and we just need to reactivate them in such cases. However, it’s a little more complicated with emails, since we can’t easily reactivate an email (I guess this is because, otherwise, it would turn into a mess really quickly if somebody tried to send an email that had already been sent, etc.)

Custom connector: where PowerAutomate makes peace with Logic Apps

Remember this screenshot?

Actually, other than Azure Functions and CDS custom actions, there is at least one other option in Power Platform which we can use to add custom code to our Power Automate Flows and/or to our Power Apps.

Those are custom connectors. We can also use custom connectors with Logic Apps, so this is where all those Azure technologies become equal in a way. Although, while Flows and Power Apps can only use REST APIs, Logic Apps can also use SOAP web services. Which gives Logic Apps a little edge, but, well, how often do we use SOAP these days?

Either way, the problem with custom connectors is that creating them is not quite as simple as creating an Azure Function or a CDS custom action.

Here is what the lifecycle of custom connectors looks like:



The last two steps on this diagram are optional. As for the first three, the reason they can be quite challenging is that there are various options for creating an API, securing it, and hosting it somewhere.

Still, what if I wanted to create a simple custom connector to implement the same regex matching that I used in the previous posts for Azure Functions and CDS Custom Actions?

I could create a Web API project in Visual Studio. There is a tutorial here:

In the remaining part of this post, I’ll show you how it worked out, and, if you want to get the source code for that regex Web API from GitHub, here is a link:

Essentially, it’s the same regex code I used for the Functions and/or for the CDS custom actions:


I can try this in Postman (I’m hiding the actual link since, depending on where I leave it, it might not be protected with any kind of authentication – you can just publish that Web API from GitHub in your tenant to get your own link):


And the result comes back (although, compared to the other versions, it’s now in JSON format):


So, let’s turn this into a custom connector.

There is a tutorial here:

But, either way, let’s see how to do it for the regex connector above.

In the Power Apps maker portal, choose the custom connectors area:


Creating a connector from scratch:





When importing from a sample, make sure to specify the full URL. This feels strange, since I would assume that, with the base URL specified before, there would be no need to specify the complete path to the API below, but it just has to be there. So, here we go (nothing goes into the headers, btw):


With the response, there is no need to provide URLs etc. – just provide a sample response:


Once the response has been imported, for some reason nothing actually changes on the screen – there is no indication that a response has been added, but it’s there:


You can click on that “default” above, and you’ll see it:


Actually, the connector is almost ready at this point and we just need to create it:


And then it’s ready for testing:


When creating a new connection above, you will probably find yourself taken away from the “custom connector” screens. So, once the connection has been created, go back to the “custom connectors” area, choose your connector, open it for “edit”, and choose the newly created connection:


Then we can finally test it:


And we can use this new connector in the Flow:


Apparently, it works just fine:


But what if I wanted to add authentication to my API? Since it’s hosted in Azure as an app service, I can just go there and enable authentication:


I can, then, get everything set up through the express option:


Save the changes, and it’s done!

Sorry, just joking – not really done yet.

The connector needs to be updated, since, so far, it does not know that authentication is now required.

In order to update the connector, I need to configure the application first. The application will be there under “app registrations” in the Azure Portal – here is how it looks in my case:


There is, also, a secret:


With all that in place, it’s time to update connector settings.

First, let’s make it https:


Here is how the connector security settings look:


The Application ID (client ID) from the app registration page in the Azure Portal goes into the Client ID field. The secret key goes into the Client secret field. The Login URL and Tenant ID are left just the way they are.

Resource URL is set to the same value as Client ID.

Then there is that last parameter, which must be copied and added to the redirect URLs for my app registration in the Azure Portal:


Now it’s actually done. Once the connector has been updated and a new “authenticated” connection is created, I can retest the connector:


It works… so I just need to update my Flow above (it will require a new connection this time) and retest the flow.

It may seem as if this was quite a bit more involved than Azure Functions or CDS custom actions. But it’s probably just a matter of perception, since, come to think of it, this is my first custom connector, and I had to figure out some of those things as I went.

More to follow on this topic, but enough for now. Have fun!

FetchXml powers turned out to be limited, and I’ve just discovered it the hard way



That’s just how many linked entities you can have in FetchXml.
I guess I have never needed more than this. That is, until today, of course:


Actually, this is not how I discovered it. I was writing an SSRS report, and the number of linked entities in my FetchXml query kept growing, so at some point the report stopped working:


That error message made me try my Fetch in the XrmToolBox, which led to the error above, which, in turn, made me look at the documentation again… and there it is:


It seems the limitation has been there forever, but it’s only been added to the docs recently:


So I’ll probably have to make the report work some other way. I might have to start using subreports instead of bringing all the data through Fetch…
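For reference, every link-entity element – nested or not – counts toward that per-query limit (which, per the documentation, is 10 at the time of writing). A minimal sketch with two linked entities:

```
<fetch>
  <entity name="account">
    <attribute name="name" />
    <link-entity name="contact" from="parentcustomerid" to="accountid">
      <attribute name="fullname" />
      <link-entity name="systemuser" from="systemuserid" to="ownerid">
        <attribute name="fullname" />
      </link-entity>
    </link-entity>
  </entity>
</fetch>
```

Once the total number of link-entity elements in a query goes past the limit, the query fails, no matter how the links are nested.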





Early transition to the UCI – possibly a false alarm yet

We all know that by October 2020 the classic web client will be retiring, and the UCI interface will take over everywhere the classic web client might still be reigning at the moment of writing this post.

This can be a very sensitive topic, though, and it can be quite confusing, too. As mentioned in this post, it seems Microsoft is now scheduling the transition for early 2020, and, quite frankly, that may scare the hell out of anybody in the community.

So, I just wanted to clarify something. From what I understand, this early transition is not the same as getting rid of the classic solution designer or the settings area. There are a bunch of environments I work with which have already been transitioned:


This screenshot is coming directly from the runone portal ( ), where you can review the environments and schedule/postpone the updates.

I can still do all my administrative tasks and solution configuration in the classic interface in that transitioned environment:


What I can’t do is work with the actual applications in the classic interface in those environments anymore.

In other words, what this change will bring over is “UCI for the end users”, but not yet “UCI for the admins”. Mind you, it’s not necessarily easy for the end users either – we have all been warned a while ago, and the clock is definitely ticking very loudly now. But, at least, I don’t think we should be concerned about losing the ability to use the classic solution designer, or to create/update classic workflows, with this early transition in 2020 (which might be in preparation for the eventual “full” transition later in the year).