Monthly Archives: July 2021

Bulk deletion seems a bit fishy

Have you tried bulk deletion lately? You may have seen the error below when you tried:

image001

Well, don’t you worry. Just wait a minute… I mean that literally. You may have to wait a few minutes, to be honest, but the key is to wait a bit and try again. It might just work on your next attempt:

image

How come? Who knows. Apparently, bulk deletion looks a bit fishy:

10 of the World's Most Dangerous Fish | Britannica

(The fish above came here from https://www.britannica.com/list/10-of-the-worlds-most-dangerous-fish)

Earlier today, I was crying for help, and Nick Doelman offered the workaround above – thanks, Nick! (Although, come to think of it, a better one might be “take a coffee break and come back” :) )

Now that everyone can quickly build and share low-code apps – will everyone do it?

“Now everyone can quickly build and share low-code apps with Microsoft Power Apps” – this is the slogan you see when you open https://powerapps.microsoft.com/en-us/

image

This is how Microsoft sees Power Apps, and, in the bigger scheme, this is what Power Platform is all about – democratizing development by providing tools everyone can use. With Power Automate, we can connect to almost everything almost everywhere, and, then, we can perform actions on those connections without writing a single line of code. With Power Apps, we can use the same connectors, but, this time, we can create UI experiences without writing any pro code.

Naturally, when looking at it that way, the next logical step would be to encourage business users to start writing applications for themselves. After all, they are the ones who know how the business operates, and, from that standpoint, they have an obvious advantage over developers, who might be able to use pro-code tools but who would not be able to foresee all the peculiar scenarios their applications may have to cover.

Traditionally, application development would be organized around a project team, and that team would be expected to follow some kind of delivery methodology – it could be agile/scrum with all the user stories, it could be waterfall with all the business requirements, or, possibly, it could be something else. However, all of those would, normally, assume that business requirements are captured, one way or another, before development starts. This would ensure that all those peculiarities of the business process are clearly explained to the developers, so that the applications eventually provide the expected functionality.

With the introduction of low-code, which has brought application development within the reach of the actual business people, where is this all going to go? Is it, really, that everybody in any organization will start building low-code apps or is there more to it?

Personally, I believe there are at least a few things to keep in mind:

  • Business users are not going to become low-code developers just because they can
  • For any organization to stay manageable, there should be some consistency in the data it’s collecting from various sources and in the processes it’s following
  • There are certain rules an organization may need to follow, and those rules may have to be enforced (think of various data protection laws, for instance)

I will talk about this more below, but, just by looking at the list above, you can probably see what I’m going to arrive at, and, basically, it’s that an organization may have to meet certain conditions to really start benefiting from the democratized application development.

Why so?

First of all, if you think of the business folks, they have their immediate operational responsibilities, and, often, their performance might be measured by certain metrics. Those have nothing to do with application development – this holds true for sales agents, for CEOs, and for business owners.

Of course, some of them might be interested in jumping into application development since this would be something they always wanted to try, but there we are talking about people trying another hobby. For those turning to low-code tools to improve their personal efficiency, there will always be a very interesting dilemma (I find it surprising that “improving personal efficiency” is often touted as a benefit, since it sort of ignores the obvious): you can become more efficient, but, once your secret is revealed to your peers, they will all reach the same level of efficiency, and you will all be equal again. What’s the point? It seems obvious, so why would anybody other than those who have been given some incentive push for personal efficiency? Of course, business owners would be naturally incentivized to improve the personal efficiency of their employees/contractors. People on commission might also see benefits in becoming more efficient, even if that ends up being only a temporary boost in payments.

However, for most of the business folks, unless, again, they were always dreaming of doing this, the idea of becoming a low-code developer might seem far from what they would really want to be doing in their spare time.

And there is no judgement there – after all, not that many pro-developers would want to become sales agents, right?

Although, there could be an alternative (or complementary) approach where organizations start encouraging employees to spend time on citizen development somehow. Possibly, they will, but, one way or another, for business users to start using low-code development tools, they must be willing to do so. In other words, everyone probably can start developing apps now, but not everyone will.

But that would still be only the first step. Imagine everybody jumping into it and starting to develop all sorts of low-code applications. For everyone involved, it might turn into a really interesting experience, but, in the end, if some of those applications start storing their data in Excel, others start using personal OneDrive accounts, and yet others opt for an Azure SQL database, this will become a nightmare for the organization, since, after all, you need to ensure the data produced by all those apps is available and manageable somehow. Otherwise, what’s the point in having that data?

But, even when you have manageable data, you can’t really allow one salesperson to start following a process that’s completely different from what another salesperson would be doing. Mind you, there might be advantages in doing exactly that, since, after all, this way the organization might be able to invent and/or identify better processes.

And, then, the second pre-condition for “everyone starting to do low-code development” would be to either have standardized business processes in the organization (so any improvement that someone comes up with would be useful for others), or to implement some sort of “incubator” program where the organization would be able to identify the improvements made by individual citizen developers and adjust the remaining business processes accordingly. Which, in the end, will lead to standardizing the processes across the organization.

Again, those scenarios are not, necessarily, mutually exclusive. There could be well-defined data and processes in the organization, but there could still be room for significant process adjustments (not just for minor process improvements/“automation”).

Let’s say there are people in the organization who can see how turning to low-code development might be useful to them, and let’s say there is some structure to the organizational data / processes those folks can rely on.

There can still be internal / external regulations and rules that the organization has to follow, and those may have to be accounted for while doing any sort of development (be it low-code or pro-code). Externally, those rules may come from various sources, and, of course, various data protection laws would be one such example. Internally, there might be a need to ensure integration of certain data with specific systems – for example, when taking a payment through a low-code app somehow, and assuming finance users are relying on SAP to see all financial data, some kind of integration may have to be implemented to ensure all payment data ends up in SAP.

How would you enforce that at the organizational level, given that it is very unlikely every low-code developer would just start following those rules on their own? The only way I see that happening is by somehow implementing an application development methodology that would be followed by everyone in the organization, and, then, you would need someone (possibly a team) to oversee proper implementation of that methodology on each “project”.

This could all be represented by a funnel like this:

image

In the end, low-code development has a lot of potential, and it’s true that “everybody can start building applications”. Whether they will, and whether they even should, depends on a number of things, though. In the end, it all comes down to whether the organization sees this as an opportunity to improve, whether it has created an environment that encourages application development by “everyone”, and whether it has the mechanisms to ensure that development is done consistently across the board.

This is not to say it’s not possible, and, where possible, it might be really beneficial, but, in many cases, getting there is not a small feat.

Testing a polymorphic lookup

There are polymorphic lookups now, but what’s the big deal? Well, I don’t have to go far to find an example of where it might be helpful. Here is a screenshot from the actual model-driven form on one of the projects I’ve been working on:

image

There are 3 different lookups on that screen, yet only one of them is supposed to be populated. Which means there are business rules involved, possibly validations, etc. I could simplify the screen above and get rid of those business rules right away by adding a polymorphic lookup – that would be a single lookup which could reference any of those three tables.
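Just to illustrate the kind of rule that goes away: here is a hypothetical Python sketch (the function name and parameters are made up for illustration) of the “exactly one of the three lookups is populated” validation that a polymorphic lookup makes unnecessary:

```python
def validate_client_lookups(contact_id, account_id, user_id):
    """Return True when exactly one of the three lookups is populated.

    This mirrors the kind of business rule the form above needs today:
    three separate lookups, only one of which may be set at a time.
    """
    populated = [v for v in (contact_id, account_id, user_id) if v is not None]
    return len(populated) == 1

# With a single polymorphic lookup, this check (and the related
# form logic) simply disappears.
```

With the polymorphic lookup, the platform itself guarantees a single reference, so there is nothing left to validate.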

As of today, polymorphic lookups can only be created through the API, and the link I put right at the top of this post provides all the required details. Except that… there is an error in the json source:

image

Heh… they had to make a mistake somewhere, or I’d really have to start worshipping the product team for the goodies they are delivering :)

Anyways, to create a polymorphic lookup we need something to send an HTTP POST request to Dataverse, so I used Restman for that:

https://chrome.google.com/webstore/detail/restman/ihgpcfpkpmdcghlnaofdmjkoemnlijdi?hl=en

Here is how the whole thing looks, and I’ll provide the json I used below:

And here is the json:

{
 "OneToManyRelationships": [
   {
     "SchemaName": "new_test_contact",
     "ReferencedEntity": "contact",
     "ReferencingEntity": "new_test"
   },
   {
     "SchemaName": "new_test_account",
     "ReferencedEntity": "account",
     "ReferencingEntity": "new_test"
   },
   {
     "SchemaName": "new_test_systemuser",
     "ReferencedEntity": "systemuser",
     "ReferencingEntity": "new_test",
     "CascadeConfiguration": {
       "Assign": "NoCascade",
       "Delete": "RemoveLink",
       "Merge": "NoCascade",
       "Reparent": "NoCascade",
       "Share": "NoCascade",
       "Unshare": "NoCascade"
     }
   }
 ],

 "Lookup": {
   "AttributeType": "Lookup",
   "AttributeTypeName": {
     "Value": "LookupType"
   },

   "Description": {
     "@odata.type": "Microsoft.Dynamics.CRM.Label",
     "LocalizedLabels": [
       {
         "@odata.type": "Microsoft.Dynamics.CRM.LocalizedLabel",
         "Label": "Test Client",
         "LanguageCode": 1033
       }
     ],

     "UserLocalizedLabel": {
       "@odata.type": "Microsoft.Dynamics.CRM.LocalizedLabel",
        "Label": "Test Client",
       "LanguageCode": 1033
     }
   },

   "DisplayName": {
     "@odata.type": "Microsoft.Dynamics.CRM.Label",
     "LocalizedLabels": [
       {
         "@odata.type": "Microsoft.Dynamics.CRM.LocalizedLabel",
         "Label": "TestClientLookup",
         "LanguageCode": 1033
       }
     ],

     "UserLocalizedLabel": {
       "@odata.type": "Microsoft.Dynamics.CRM.LocalizedLabel",
       "Label": "TestClientLookup",
       "LanguageCode": 1033
     }
   },

   "SchemaName": "new_TestClientLookup",
   "@odata.type": "Microsoft.Dynamics.CRM.ComplexLookupAttributeMetadata"
 }
}

With that, I now have my lookup field added to the new_test table:

image

Although, it does not look quite right there, since, it seems, I can only see one related table in the designer. But, well, that’s only a preview after all, and that seems to be just a minor inconvenience.

Once this new field is added to the form, I can use records from any of the 3 referenced tables to populate the lookup:

image

Now, here is what I was interested in beyond the fact that we can reference multiple tables: what happens on the TDS endpoint side, since this is what I’d be using for Power BI paginated reports? And, actually, it’s all good there. Here is an example:

image

So, basically, the TDS endpoint will give us three columns for this kind of lookup. The first column provides the referenced record guid, the second one provides the referenced record type, and, finally, the third one provides the referenced record “name”. Which is more than enough to do proper reporting in Power BI/SSRS.
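As a sketch, a report query over those three columns could be composed like this. Note the column-naming convention below (base name, plus “type” and “name” suffixes) is my assumption from the screenshot – verify the actual column names in your own environment before relying on it:

```python
def polymorphic_lookup_query(table: str, lookup: str) -> str:
    """Compose a TDS-endpoint SELECT that reads the three columns exposed
    for a polymorphic lookup: the referenced record guid, the referenced
    table type, and the display name. The 'type'/'name' suffix convention
    is an assumption - check the real column names in your environment."""
    return f"SELECT {lookup}, {lookup}type, {lookup}name FROM {table}"

print(polymorphic_lookup_query("new_test", "new_testclientlookup"))
```

A query like that is what a Power BI paginated report would run against the endpoint.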

And what about the advanced find?

It’s interesting there, since, when specifying filtering conditions, I can see a combined list of possible operators. For example, on the screenshot below, I’ve selected “Equals Current User” (and this is a polymorphic lookup, so I could have selected “Equals” and picked an account instead):

Still, if I used “Equals” condition, I could pick from either of the three referenced tables:

Well, that’s some cool stuff from the product team, and the only question for me is whether it’s something I should start using in production, or whether I should wait till this is out of preview. Holding off for now, but this is really testing my patience :)

Using Environment Variables to configure Word Template action in the cloud flows

Power Automate word templates are much more fun to work with than the classic D365 word templates, and we can happily use them in model-driven apps with a bit of javascript:

https://www.itaintboring.com/dynamics-crm/power-automate-word-templates-in-model-driven-apps-forms-integration/

However, how are we supposed to deploy flows containing the “Populate a Microsoft Word Template” action to different environments so that each environment has its own version of the template? Presumably, this is how we would ensure proper ALM for the template files (wow, that sounds crazy, but… we would not want developers to just update the production version of the templates, right?)

To start with, let’s look at how that action would, normally, be configured:

image

Although, those user-friendly names are not, really, what the connector is using behind the scenes. We need to look a bit deeper by using the “peek code” option:

image

There are a few “id” strings there, and that’s what the connector is using instead of the friendly names it’s showing us.

Therefore, if we wanted to configure the action to take in dynamic values for those parameters, we would need to know those values for each environment. One way to get them would be to simply point that action at another environment, use “peek code”, copy the values, and then repeat this for every other environment.

Let’s assume we have correct values for each environment.

The next step would be to create 3 environment variables, configure default values, and use them to configure action parameters:

image

It’s worth noting that, unlike flow variables, environment variables will actually have values at “design time”, which means the flow will be able to parse the template. If you tried using flow variables, they would not be “set” yet, since that only happens at run time. Hence, the action above would not be able to find the template at design time, and you’d be getting an error.

But I digress. We are using environment variables above.

Now, all you need from here is:

  • Export your managed solution that includes the flow and required environment variables
  • In the target environment, open Default Solution, locate those 3 variables, and configure custom values which would be specific to each environment
  • You may need to turn the flow off and on, since cloud flows tend to cache environment variable values
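The resolution logic behind that second step is simple: an environment-specific value, when configured, overrides the default that shipped with the solution. A tiny illustrative sketch (the template paths are made up):

```python
def resolve_env_variable(default_value, environment_value=None):
    """Mimic how an environment variable resolves: the value configured
    in the target environment, if any, overrides the default value that
    was shipped with the (managed) solution."""
    return environment_value if environment_value is not None else default_value

# Dev just uses the default; production gets its own template location:
dev_path = resolve_env_variable("/templates/dev/contract.docx")
prod_path = resolve_env_variable("/templates/dev/contract.docx",
                                 "/templates/prod/contract.docx")
```

Which is exactly why you only have to configure the custom values once per environment.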

For example, here is how one of those env variables is configured:

And voila… Every environment will now be loading document template from the proper location, yet those templates can even have minor differences as you can see on the screenshot below:

image

And that’s it.

PS. Btw, I can’t help but mention that implementing this kind of “ALM” might be a pain in the neck with the classic word templates, since they are not solution aware at all, and the only way to copy them from one environment to another would be to manually update the template files to match the proper entity id-s… or to use XrmToolBox for that, since, I believe, there is a plugin there.

Upcoming pricing changes for Power Apps

With the recent announcement, one thing is clear: Power Apps licensing has never been cheaper. With no licence minimums or other purchase requirements, pretty much any organization should be able to start using Power Apps now:

image

Although, as it usually is with licensing, there is a fly in the ointment. Those changes are coming into effect on October 1, 2021. Until then, even though there is a promotional offer of $12 ($3 on the per app plan), that offer is only applicable to a minimum purchase of 5000 licenses.
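To put that minimum in perspective, a quick back-of-the-envelope calculation using the per-user promo price mentioned above:

```python
promo_per_user = 12      # $ per user per month (promotional offer)
minimum_licenses = 5000  # minimum purchase required for the promo

# The smallest monthly commitment at which the promo even applies:
monthly_commitment = promo_per_user * minimum_licenses
print(monthly_commitment)  # 60000, i.e. $60,000 per month
```

So, until October, the “cheaper” pricing is really only cheaper for rather large deployments.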

Either way, cheaper licensing is always a good thing, and I’m happy this is happening.

But, then, I still remember the time when Dynamics CRM 2011 was somewhere in the range of $40-45 per user licence, and, from there, licensing fees only kept growing.

Funny enough, current Power Apps pricing is exactly the same, but, since first-party applications are not included in that price, it only seems fair that Power Apps pricing should have been lower, right? Since, after all, we are, essentially, paying for platform access, and there is no out-of-the-box business functionality included there.

This seems to be an ongoing problem with Power Platform licensing. We all may have some idea of what is fair and what is not. Microsoft may have some idea, too, and, as this announcement shows, they might actually be quite a bit off… to such an extent that they can cut licensing fees by 50%. But none of that is going to mean anything until, somehow, Power Platform licensing fees get translated into resource consumption fees so that we could see the underlying resource usage and associated charges.

At least that way we could clearly see how $20 or $5 translates into CPU / memory / traffic / etc usage. Apparently, there would also be some licensing fee for the “platform access”, but, right now, this is all bundled together, and, so, I’d think those prices can go up or down depending on how it all balances out in the books year after year.

Hence, it’s great the prices are going down. I think it would have been even better if there was a clear explanation of how those prices are set (in relation to the Azure resource consumption fees in general). Without that, will the price go up next year? Will it go further down? It’s kind of hard to say. Well, still, it does not change the fact that we just got much better pricing, so… have fun with Power Platform!