Author Archives: Alex Shlega

Onload events for form updates


Form notifications can be a little sensitive as I found out this morning when the notification I expected to show up was totally misbehaving. Of course it was Monday morning, so my first thought was that, maybe, that notification just had a really good weekend and simply was not willing to get back to work, but, as it turned out, it actually needed some help to get back on its feet again.

Essentially, my form notification was supposed to tell the user that there are some validation errors. Those errors would be stored in a text field which, in turn, would be populated from a plugin. So the whole process would look like this:


Here is the piece of plugin code just so you could see it:


And here is a piece of the related javascript:


So the plugin would put validation results into the field, and the script would look at that field to either display a message or to clear the notification (the script is registered on both form OnLoad and field OnChange).

All good? Well, no.. That part with “clearFormNotification” worked fine in the form “OnLoad” but did not want to work with the field OnChange whenever there were no validation errors.

So I started looking around and found this tip:

Which basically proved the thought that had started to form by that moment.

This kind of workaround with OnChange works only if there is some value in the field. It does not work, though, if the field value can be set to null as a result of the update operation.

So, as per the tip above, I’ve introduced a new boolean field, updated the plugin to always set that field to true or false (never to null), and attached the OnChange event handler to that field instead of the original one.
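To make this concrete, here is a minimal sketch of the resulting script. The field names (new_validationpassed, new_validationerrors) are made up for illustration, and the handler is meant to be registered on both form OnLoad and the boolean field’s OnChange with “pass execution context as first parameter” enabled:

```javascript
// Hypothetical field names -- substitute your own schema names.
// The plugin always sets new_validationpassed to true or false, never null,
// so this OnChange handler fires reliably on every update.
function onValidationFlagChange(executionContext) {
    var formContext = executionContext.getFormContext();
    var passed = formContext.getAttribute("new_validationpassed").getValue();
    if (passed) {
        // No errors: remove the notification (this is the part that used to
        // fail when the watched field could end up being set to null)
        formContext.ui.clearFormNotification("validation");
    } else {
        // Show the validation errors the plugin stored in the text field
        var errors = formContext.getAttribute("new_validationerrors").getValue();
        formContext.ui.setFormNotification(errors, "ERROR", "validation");
    }
}
```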

My notifications are feeling well and happy now!


This or That #1: PowerApps solution designer vs Classic solution designer

You probably know that we do have two solution designers these days.  Here is a related “this or that” question, then. There is a new designer, and there is an old designer.. Which one are you going to use moving forward and why?

Btw, would you rather watch and listen? Just scroll down to the bottom – there is a video there.

The old one offers classic solution designer experience:


The new one is aligned with the PowerApps interface:


Up until very recently, I was thinking of the new solution designer as an emerging tool that will sooner or later overtake the classic designer. However, I sort of thought this would happen gradually, through new “convenience” features such as the WYSIWYG form designer, while, in general, the classic designer would stay relevant until it was finally disabled (at which point the new PowerApps designer would have to cover the functionality the classic designer used to cover).

What it seems I did not realize is that the product team may have decided to take a slightly different approach. There are features which have been missing even from the classic designer, so they would have to be implemented in both versions. Now, would it really make sense to keep adding those new features to a tool that’s probably going to disappear anyway? Or would it make sense to pivot at this point and say that those new features will be implemented in the new tool only?

Actually, I am not sure if this is how the PowerApps product team is looking at it, but, judging from what has recently been delivered, it might well be:


Yes, we do have an out of the box interface to create Autonumber fields now without having to resort to the XrmToolBox or SDK calls. But.. We only have that new field type in the new interface. There is no such option in the classic UI:
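For reference, the SDK-level alternative the new UI replaces boils down to a metadata call against the Web API. A sketch of such a request builder; the entity name, schema name, and the “CASE-{SEQNUM:4}” format are illustrative assumptions:

```javascript
// Builds a Web API request that creates an autonumber field on an entity.
// AutoNumberFormat is the metadata property behind the new autonumber option;
// {SEQNUM:n} is the sequential placeholder, the prefix text is free-form.
function buildAutonumberRequest(instanceUrl, entityLogicalName) {
    return {
        url: instanceUrl + "/api/data/v9.0/EntityDefinitions(LogicalName='" +
             entityLogicalName + "')/Attributes",
        method: "POST",
        headers: {
            "Content-Type": "application/json",
            "OData-MaxVersion": "4.0",
            "OData-Version": "4.0"
        },
        body: JSON.stringify({
            "@odata.type": "Microsoft.Dynamics.CRM.StringAttributeMetadata",
            "SchemaName": "new_casenumber",
            "MaxLength": 100,
            "AutoNumberFormat": "CASE-{SEQNUM:4}",
            "RequiredLevel": { "Value": "None" },
            "DisplayName": {
                "@odata.type": "Microsoft.Dynamics.CRM.Label",
                "LocalizedLabels": [{
                    "@odata.type": "Microsoft.Dynamics.CRM.LocalizedLabel",
                    "Label": "Case Number",
                    "LanguageCode": 1033
                }]
            }
        })
    };
}
```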


But, then, is there anything that’s completely missing from the new designer? There might be other things, but I figured I’d just dig into the more advanced areas, so I looked for the Field Security settings, and, from what I can see, there is no way of enabling/disabling Field Security on a field in the new designer yet:


So, things are getting really interesting because there is no definitive answer. There is certain functionality that’s available in the new designer and that is not available in the classic one. But there is, also, some functionality that’s available in the classic designer and is not available in the new one.

Personally, I think it only makes sense to start using the new designer experience wherever I can, simply because it’s certainly the version that’s going to stay, and, also, it’s the version that seems to be getting new functionality first:


Just one note.. Why did I write “first” above as if I were thinking the classic designer might still be updated? See, everything is great with the new designer except that it’s not available on-premise. So it could be that, at some point, those new features will still be added to the classic designer as well. We’ll see.

If you were looking for a recording of this episode, here you go:

PS. Have a look at the other “This or That” episodes !

Good old validations, and why the plugins/workflows are still alive and kicking


Interesting how, with all the power Flows can offer these days (we can even customize Sharepoint integration with Flows), there is still one scenario which just cannot be covered without the classic plugins/workflows.

Namely, anything that requires synchronous (or real-time) processing would need a plugin or a workflow.

For example, what if you wanted to intercept all create/update operations in such a way that nothing gets saved when the data does not validate?

The diagram below would not reveal anything new to the folks who have been working with Dynamics for a while, but, if you are just getting into the model-driven applications development, this may be something to keep in mind:



Although, what complicates this a little bit is licensing. Validations are great, but, if there are users utilizing your data in a Canvas Application on Plan 1, adding a plugin/real-time workflow to an entity exposed to such an app would require those users to go up to Plan 2.

Anyway, just to make this post “complete”.. how do you display server-side validation errors in the interface?

Here is how you can do it from a real-time workflow:

And here is how you can do it from a synchronous plugin:

Creating custom folder structure for Sharepoint integration using Flows

We’ve come across an interesting problem recently. Imagine that you have a few different business units, and you want Sharepoint integration to work in such a way that all the documents within each business unit would be shared, but there would be no sharing between business units.

This is not how it normally works in Dynamics, though, so the options are:


The permissions replicator is great, and it would do the job, but, just to explore that custom implementation option a little more, what can we actually do?

If it were on-premise, or if it were a few years ago, we’d have to think about a plugin for this.

But, since we have Flows, I figured it would be interesting to give it a try. And it worked.. Not without a few quirks, but I did not have to write a single line of code. Which is unusual, to say the least.

Here is how it happened, step-by-step.

1. Creating a solution

First, we need to create a solution..  I’ll be creating a flow in the next step, so I figured this would be just the right place to create this solution:


2. Creating a flow

I will add more details about the flow down below, but, at this point, here is a screenshot of what the flow will look like once it’s built:


3. Preventing default document location logic

There is a bit of a problem with this whole approach. What if somebody created a case and jumped to the “related documents” right away? The flow above is asynchronous, so it won’t create the document location immediately. If a user navigates to the related documents too quickly, Dynamics will create the default document location.. so we need to prevent that from happening somehow.

That’s what this real-time workflow will be doing:


It will run “on create” of document location records, check if the record is regarding a case AND does not have a keyword in the Name field (that keyword will be added to all locations created from the Flow above), and, if so, stop the workflow as cancelled.

Honestly, the error message is going to look somewhat nasty:


After a minute or so that grid will get back to normal, but, of course, the users will have to refresh that screen:


Besides, most of the time Dynamics users would not create a case just to start uploading documents right away, so there is a good chance that error would not be showing up in 90% of the cases. Still, there seems to be no way around it except the notorious “user training” (as in “yes, if you navigate to the related documents too fast, you will see an error message”).

Actually, the screenshot below shows exactly what’s happening in Dynamics as a result of the flow execution.

And, of course, a folder gets created in Sharepoint:


4. What about that flow, though?

Step one is a trigger. We need to create a document location for every new case.

Step 2 will get the case owner’s user record from Dynamics – this is to get to the business unit.

Which is what step 3 will do – it will query the case owner’s business unit record from Dynamics.


In the next step, the flow will query document location record by name:


This is part of the setup, actually.

For every business unit, an administrator will have to do 2 things in advance:

  • Create a sharepoint library and set up permissions in such a way that only BU users will have access to that library
  • Create a document location record in Dynamics pointing to that sharepoint library (and having the same “name”)


Here is an example of the sharepoint library:


And here is a corresponding document location record:


So, basically, the flow will find that location by name (treecatsoftware is the business unit name), and, then, will use it as a parent location when creating document location for the cases in that business unit.

Finally, we need a foreach in the flow. Technically, there is supposed to be only one record that matches the condition (name = business unit name). I just don’t know how to reference exactly the first one in the record set, so I figured I’d go with foreach:


The first action above will create a folder in Sharepoint for the new case. It’s an HTTP request Sharepoint connector action, and here is the URI:
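I am not reproducing the exact URI here, but a folder-creation call through the Sharepoint REST API generally takes this shape. The helper below just builds the request, and the library and folder names are placeholders:

```javascript
// Builds the request the "Send an HTTP request to SharePoint" action sends.
// libraryName stands in for the per-business-unit library; the case folder
// is named after the record, as the Dynamics integration expects.
function buildCreateFolderRequest(siteUrl, libraryName, folderName) {
    return {
        url: siteUrl + "/_api/web/folders",
        method: "POST",
        headers: {
            "Accept": "application/json;odata=verbose",
            "Content-Type": "application/json;odata=verbose"
        },
        body: JSON.stringify({
            "__metadata": { "type": "SP.Folder" },
            "ServerRelativeUrl": libraryName + "/" + folderName
        })
    };
}
```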


And the second action will create a document location in Dynamics:
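That second action corresponds to a plain Web API create on the sharepointdocumentlocation entity. A sketch, with made-up GUIDs and with “FLOWLOCATION” standing in for whatever keyword the real-time workflow checks for in the Name field:

```javascript
// Builds a Web API request that creates the case's document location.
// The name carries the keyword so the real-time workflow can tell
// flow-created locations apart from the default ones it cancels.
function buildDocumentLocationRequest(instanceUrl, caseId, parentLocationId, folderName) {
    return {
        url: instanceUrl + "/api/data/v9.0/sharepointdocumentlocations",
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
            "name": "FLOWLOCATION " + folderName,
            "relativeurl": folderName,
            "regardingobjectid_incident@odata.bind": "/incidents(" + caseId + ")",
            "parentsiteorlocation_sharepointdocumentlocation@odata.bind":
                "/sharepointdocumentlocations(" + parentLocationId + ")"
        })
    };
}
```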


Here is a link to the exported solution (I did not try to import it anywhere yet):

Unexpected goodies just keep showing up!

I am not sure what’s been happening to the PowerApps product team recently, but, whatever it is, I think a “thank you” is in order.. it just seems they are really working on tidying up the UI these days.

Just today, I noticed a couple of new(?) things, and it feels like.. an unexpected gift. No, really.

I can scroll through all the records in the lookup control now. It will open up with some records loaded, and, then, it will load more data as I keep scrolling. This is it – the dark days of the classic UI are over!


And there is one less reason to use optionsets in place of lookups now.

Actually, I don’t know when this change happened. Has it been there for a while and I just did not see it?

What I do know, though, is that only a week ago I made this post:

And I’m pretty sure the post itself does not have anything to do with the fact that it’s not an issue anymore and the grid is not truncated:


Well, it’s been a good day, it seems!

Using FetchXml in the Flows


Having established (for myself.. and not without the help from other folks) that the CDS Connector is the way to go when working with Dynamics or CDS data, I quickly found that the ListRecords action does not support fetchXml. Or, at least, I don’t see a parameter for that:


That said, WebAPI endpoint does support fetch execution:

So this seems to be a current limitation of the CDS Connector (intentional or not).

Technically, fetchXml is more powerful than OData when it comes to building complex queries, traversing relationships, etc. I am not sure what the future holds for FetchXml, since it’s a proprietary “query language” developed and maintained by Microsoft, but, for now at least, it’s a legitimate way of querying data from Dynamics and/or from CDS.

So, what if I wanted to use FetchXml to validate some advanced conditions in my Flow?

Not to go too far into the complexities of FetchXml, let’s try adding a condition that verifies whether the primary contact on the account that’s just been created has associated cases. The query would look more or less like this:


And, if that query returns any cases at all for the account, I’d like my flow to add a note to the account description field.
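A FetchXml query along these lines could express that condition; the attribute names below are the standard incident ones, and the exact query may of course differ:

```javascript
// Retrieves cases whose customer is the new account's primary contact.
// The contact id from the trigger gets substituted in at runtime.
function buildCaseFetchXml(primaryContactId) {
    return '<fetch top="50">' +
           '<entity name="incident">' +
           '<attribute name="incidentid" />' +
           '<filter>' +
           '<condition attribute="customerid" operator="eq" value="' +
           primaryContactId + '" />' +
           '</filter>' +
           '</entity>' +
           '</fetch>';
}
```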

So, to give you an idea of what my flow will look like, eventually, here is a screenshot:


The first step is straightforward – it’s a CDS Connector trigger which will kick in whenever an account record is created.

The second step is where I’ll actually run FetchXml.

The third and fourth steps are all about parsing the results and verifying the condition.

For the second step, even though it’s probably possible to do the same with a pure HTTP connector, I figured I’d use an HTTP with Azure AD (Preview) connector instead. It turned out it takes care of the authentication already, so I don’t need to worry about that (with a note, though, that I am not exactly sure what’s going to happen when/if I add this flow to a solution and export/import it to another Dynamics CE instance.. will try it later).

There seem to be two tricks to that connector, and it took me a little while to figure them out (I’d almost given up, to tell the truth). When you are adding it to the flow, you’ll be presented with this screen:


I used the “Invoke an HTTP request” action in my flow (and you’ll see below how that action was set up), so, let’s say you’ve selected that action.

Depending on something in your environment (and I am not sure what it is exactly), you will see one of these two screens after that:

Screen a:


Screen b:


If you see screen a right away, that means your HTTP with Azure connector has picked up a connection, and it’s not necessarily the right connection. In my case, I quickly discovered (once I tried running the flow) that my action was permitted to access Sharepoint but not Dynamics – that must have something to do with Azure AD OAuth, although I am not 100% sure of what’s happening behind the scenes yet.

So, if, somehow, you run into this issue, make sure to verify the connection your action has picked up, and, if it’s not the right one, create a new connection:


This will bring you back to Screen B from above. Once you are there, fill in the textboxes like this:


Do not put “.api.” in the URLs as you would normally do when accessing the WebAPI endpoint. Just use the root URL for your Dynamics instance.

After that, sign in, and you’ll be back to Screen A with the right connection this time.

The rest should be straightforward..

Set up the action like this:


  • Choose the GET method
  • Make sure to use the root instance URL for the request (do not add “.api.” in the middle)
  • Add the FetchXML to the URL (download it from Advanced Find, update as required, etc.)
  • Don’t forget to update the filter condition in the FetchXML so that the request is using the correct account id
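Putting those bullet points together, the URL for the action could be built like this; the instance URL and entity set name are placeholders:

```javascript
// Builds the GET url for the "Invoke an HTTP request" action. Note the root
// instance url (no ".api." segment) and the url-encoded fetchXml parameter.
function buildFetchRequestUrl(instanceUrl, entitySet, fetchXml) {
    return instanceUrl + "/api/data/v9.0/" + entitySet +
           "?fetchXml=" + encodeURIComponent(fetchXml);
}
```

For example, `buildFetchRequestUrl("https://myorg.crm.dynamics.com", "incidents", fetchXml)` produces the kind of URL you can also paste into the browser to test the query.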


Next, add Parse JSON action like this:


You can just run the same URL that you put into the Http action directly in the browser to generate sample data, and, then, you can feed that sample data to the “use sample payload to generate schema” tool of the “Parse JSON” action.

And the last step – just add a condition:


I used an expression to get the total # of records in the result set (the length function) – you can see it in the tooltip on the screenshot. But, in retrospect, fetchXml aggregation might work even better for this scenario.

Do something when the condition evaluates to true (or false), and the flow is ready:


Time for a test drive? Here we go:



Using different connectors with Dynamics CE instances when creating a Flow


There are at least 3 different connectors we can use with Dynamics CE instances when creating a Flow:

  • CDS Connector
  • Dynamics 365 Connector
  • Http Connector

They are not identical, though, and, from what I have gathered so far, we may need to keep a few things in mind when choosing between the three:

1. When using a CDS connector, make sure your environment has been upgraded

Otherwise, you won’t be able to see your Dynamics CE database from that environment. For example, here is what showed up when I tried creating a flow in the old Default environment which was never upgraded:


And here is what shows up when I’m working with a flow in my Dynamics CE environment:



2. There are rumors that Microsoft has stopped working on the Dynamics 365 Connector

That said, most of the Dynamics 365 connector functionality seems to be available in the CDS connector. Although, there is no “Create or Update” trigger in the CDS connector, and it’s not unusual to need the same processing to happen in response to either of those events.

If you are using a CDS connector, you may, then, consider starting to use nested flows:

Which would, basically, look like this:


And yes, personally I think that’s quite a complication for somebody used to the native Dynamics workflows. Which is beside the point, though, since those are different “technologies”.

However, unlike the Dynamics connector, the CDS connector has a “Current” environment option, which is why it can be packaged into solutions. You move it to another environment, and it’s still “Current”, so it just works.

3. The Http Connector is a low-level approach, and, just like any other low-level approach, it’s more complicated to implement, but, of course, it gives you the most control

Mostly, it seems, we’ll be dealing with these two problems:

  • There is no seamless authentication – here is how you can try working around it:
  • Forming the requests and parsing the responses – both the CDS and Dynamics connectors have a rather good idea of what the environment looks like (entities, types, etc.). The Http Connector, on the other hand, will make no assumptions at all, so you’ll need to know the entity names, you’ll need to know how to write Web API queries, etc.
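For illustration, here is roughly what a hand-written Web API request for the Http Connector involves; the instance URL, entity set, and query are all made up:

```javascript
// With the Http Connector, entity set names, OData query syntax, and the
// headers are entirely your responsibility -- nothing is inferred for you.
function buildAccountQuery(instanceUrl) {
    return {
        url: instanceUrl + "/api/data/v9.0/accounts" +
             "?$select=name,accountnumber" +
             "&$filter=statecode eq 0&$top=10",
        method: "GET",
        headers: {
            "Accept": "application/json",
            "OData-MaxVersion": "4.0",
            "OData-Version": "4.0"
        }
    };
}
```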


But, in return, you’ll be able to do everything (?) that can possibly be done with the Web API

Updated Advanced condition builder and the importance of the social media

You would think those two things in the subject line don’t have much in common, but I just had a eureka moment, which, I believe, had a lot to do with the following three facts:

  • I am very new to Flow
  • There are things in Flow which are very new to Flow as well
  • And there are people who are exploring all those new things and letting us know of their findings on social media

Imagine.. I am looking at how to create advanced conditions in Flow so I can do some filtering when an update/create (Dynamics) trigger kicks in, and there is this great video by Elaiza Benitez where she explains those things for the newbies like me:

Great, eh?

Well, of course, except that there is no advanced mode anymore:


So I am rewinding her video to see if I missed anything, and it just so happens that Andrew Ly makes a post on LinkedIn where he is praising the updated condition builder, and I see it in my LinkedIn feed just when I am trying to figure it all out:


This is where it hits me. Aha.. Maybe some things have changed?!

So, I am hoping you know by now that there were some changes and they are described here:

In my particular scenario, it turned out it really was not that difficult to create the condition I was looking for – I just had to use pretty much the same expression that you’d see in the video above to convert my original field value to 1 or 0, depending on whether that value was empty or not:


if(empty(triggerBody()?['_parentcustomerid_value']), 1, 0)

And, then, I used the computed value in a simple “equals” condition:


Simple indeed, but I am still thinking of the mysterious events of the last hour or so which involved:

  • A youtube video by Elaiza Benitez
  • A linkedin post by Andrew Ly
  • A blog post by Stephen Secilliano

All of which took me from “how the hell am I going to do this” to “it was not that bad at all”.