Monthly Archives: June 2017

Let’s rule the business process (Part 2)

As promised, here is the second part of the “Let’s rule the business process..” post.

There is an unmanaged solution you can download here:

That solution includes a custom entity, a business process, a business process entity, and two workflows. Basically, there is everything you need to try it yourself.


There are a couple of error messages which can tell us some details about the internals of the business processes:


To move to the next stage, complete the required steps

Business Process Error
To move to the next stage, complete the required steps. If you contact support, please provide the technical details.

This one was fully expected – it happens when there are required steps that have not been completed before you try to move the process to the next stage.


Transition to stage … is not in the process active path

Business Process Error
Invalid stage transition. Transition to stage … is not in the process active path. If you contact support, please provide the technical details.

This error happened when I tried to jump ahead: instead of moving the process to the next stage, I tried to skip one. As you can see, Dynamics has an internal validation mechanism to prevent this from happening.

That’s pretty much it. One other note: those errors happen in exactly this order. Required steps will be validated first, and, then, once everything is good there, the active path will be validated as well.
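Based on the behavior above, the whole validation sequence can be modeled with a little sketch (this is purely a conceptual model of what Dynamics appears to do internally – not actual product code, and all the names below are made up):

```python
# A toy model of the two validations Dynamics appears to run, in order,
# when a business process is moved to another stage. Names are invented.

def try_move_to_stage(current_stage, target_stage, active_path, required_steps_complete):
    """Return the error message Dynamics would show, or None if the move is allowed."""
    # 1. Required steps on the current stage are validated first
    if not required_steps_complete:
        return "To move to the next stage, complete the required steps."
    # 2. Only then is the target stage checked against the active path:
    #    it has to be adjacent to the current stage (no skipping)
    current_index = active_path.index(current_stage)
    target_index = active_path.index(target_stage)
    if abs(target_index - current_index) != 1:
        return "Transition to stage " + target_stage + " is not in the process active path."
    return None

path = ["Qualify", "Develop", "Propose", "Close"]

# Skipping a stage AND leaving required steps incomplete still reports
# the required-steps error, because that check runs first:
print(try_move_to_stage("Qualify", "Propose", path, required_steps_complete=False))
# -> To move to the next stage, complete the required steps.
```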

Let’s rule the business process.. with a workflow!

I was reading a recent post by Dynamics team – they are talking about different ways to manipulate business process stages in Dynamics 365 there:

Best practices for automating business process stage progression in Dynamics 365

It’s a great article, and it has some interesting insights. What definitely caught my attention is that, it seems, we can now manipulate business process flows using workflows. Isn’t that what we always wanted to do? Imagine verifying some conditions in a workflow and moving the process flow to the right stage depending on whether those conditions have been met. That would add quite a bit of automation to the whole process.

And, it seems, it’s really doable now. I did manage to make it work – just keep reading and you’ll find out how. However, keep in mind that this whole article applies to the post-fall-2016 versions of Dynamics 365.

First, here is what we need to know:

  • For each business process flow there is an entity in Dynamics now. You can give it a name when creating a business process flow
  • That entity, among other fields, has two lookup fields: a lookup to the main entity (the one for which this process flow is supposed to be running), and a lookup to the process stage entity
  • Every time a process is activated, a new record of that entity type is created
  • Active Stage lookup field in that record gets updated every time we move the process through the stages
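Put together, that means we should be able to query the process instance record directly. Here is a rough FetchXML sketch (all names here are assumptions: “new_maindemo” stands for whatever name you gave your business process entity, and the lookup attribute name will depend on your main entity):

```xml
<fetch top="1">
  <!-- "new_maindemo" is a made-up name for the business process entity -->
  <entity name="new_maindemo">
    <!-- the Active Stage lookup mentioned above -->
    <attribute name="activestageid" />
    <filter>
      <!-- the lookup back to the main record this process is running for -->
      <condition attribute="new_processdemoid" operator="eq" value="{main-record-guid}" />
    </filter>
  </entity>
</fetch>
```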

To test how it works, I set up a new entity and a new business process. Here is what the solution looks like:

  • Process Demo is the main entity which will have a process, etc.
  • Main Demo is the entity that was created for the Main Demo business process (there is an entity per process, remember?)
  • There is the Main Demo process itself
  • And there are a couple of workflows – I’ll explain what they are for below

It turns out we can really use the business process entity in workflows, just like any other entity. However, there is a little problem. The business process entity is related to the main entity as N:1 (many process instances, one main entity), which means we can’t really access the active business process from a workflow running on the main entity (it would be like asking a workflow running on an account record to access a particular contact linked to that account).

However, this relationship is, in practice, 1:1 rather than N:1: at any time, only one active process of that particular type can be associated with the main entity.

In other words, if we wanted to make it work, we would need to add a lookup from the main entity to the process entity. And that’s the first step – let’s add a lookup:


Once there is a lookup, we can use it in a workflow running on the main entity. Problem is, we still need to get some value into that lookup first.

That’s what one of those workflows is for:

BTW, do you see something unusual there? Where did that new trigger called “process is applied” come from? It seems to be one of the features of this new version of Dynamics, and it’s exactly what we need. This workflow will start whenever a process is applied to the main entity, and it will populate the lookup field on the main entity.

Once it’s done, we can use that lookup value in the workflows running for the main entity.

That’s what the second workflow is for:

The workflow will run when a new record is created or when the checkbox field is changed. It will verify whether the checkbox is selected, and, if it is, it will modify the active process stage.

And that’s, actually, it. I just got a workflow that can change the active stage of the business process!
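For reference, the update that workflow performs boils down to setting the Active Stage lookup on the business process entity record. As a raw Web API sketch (the organization URL, entity set name, and GUIDs below are all made up; activestageid and processstages are meant to match the actual schema, but double-check them in your organization):

```http
PATCH https://myorg.crm.dynamics.com/api/data/v8.2/new_maindemos(00000000-0000-0000-0000-000000000001) HTTP/1.1
Content-Type: application/json

{ "activestageid@odata.bind": "/processstages(00000000-0000-0000-0000-000000000002)" }
```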

PS. Quite frankly, this was a little condensed version of how it all happened, so, here is what you’ll find in the next post:

  • I’ll provide a link to the unmanaged solution file
  • I’ll describe a few error messages I’ve run into while working on it

See it here:

A closer look at the Organization Insights Solution for Dynamics 365

I was looking at the Organization Insights tonight, and I just realized that it seems to be a very special solution:

When looking at that dashboard, don’t you think it would be great if we could access that information from the Web API? Or, possibly, set up some alerts/notifications? Usually, the solutions we deploy to Dynamics all follow the same approach:

  • They will create some custom entities to store solution-specific data
  • They will create dashboards, reports, and other UI components so you have a way to work with that data

In the case of Organization Insights, it seems that we are getting the presentation “layer”, as usual, but we are not, actually, getting the data “layer”.

Technically, we have never had direct access to the Dynamics 365 / CRM database or hard drive. So, the kind of information you see on that dashboard is not, really, available in regular Dynamics solutions:

  • We can’t use SDK/Web API to estimate storage usage
  • Which means we also don’t have access to the storage details per table
  • We don’t have any statistics on the number of API calls
  • Etc

In other words, the Organization Insights solution is not, really, your regular Dynamics solution – it’s, likely, more of an external component that is embedded into Dynamics through that dashboard.

So, where is it taking the data from?

I used Fiddler to capture the traffic quickly, and here is what showed up in the log:

There seems to be a separate service running on a different URL (different from my Dynamics organization URL), and, it seems, that’s where the Organization Insights solution is getting all the data from.

By the way, I am almost certain it’s not a coincidence that both the Organization Insights service and the Dynamics organization have the same domain name:

Notice how the log starts from my organization url, then it goes to the (where it seems to be doing some sort of discovery), and, from there, it goes to the (where, it seems, it’s finally getting the actual data).

I am guessing it’s, really, some sort of external web service which is more or less independent from Dynamics, and which has access to server-side statistics/information that is not available from within Dynamics. It might be possible to piggyback on that service if we wanted to utilize Organization Insights data from within Dynamics, although, unless Microsoft actually documents the usage of that service, that kind of solution might not be very reliable in the long term.

There is another interesting implication, though.

There was a question in the community forums recently where somebody was trying to compare Organization Insights statistics for the AuditBase table with the information you can see in Audit Log Management in Dynamics:

There can be quite an impressive difference between those two numbers – if I’m not mistaken, in that particular case it was about 10 GB under Log Management vs about 35 GB reported by Organization Insights.

Why? It turned out Organization Insights was including the size of the database indexes as well (index size is also counted against your storage allocation, so it definitely makes sense to include it in the overall storage usage numbers); Log Management, however, does not include index size. This kind of difference is not only impressive – it can be costly to ignore.

So, the Organization Insights solution does bring in quite a bit of unique insight which we can’t really get from anywhere else when working with Dynamics 365 (online). It does it by utilizing some kind of external web service – apparently, we are not supposed to be using that service directly.. not yet, at least. Since this solution is utilizing a special web service hosted in the cloud, there is, probably, no intention to make it available to on-premise customers. That said, it’s a very useful solution, especially for Dynamics online clients, since it’s, probably, the only way to figure out the exact details of Dynamics storage utilization without creating a support ticket with Microsoft.

The mystery of the OwningBusinessUnit, or how can we use Advanced Find to locate all records in a particular BU?

“I have Vehicle custom entity in Dynamics. How do I find all Vehicles that belong to a particular Dynamics business unit?”

Heh.. Strange that you are asking – just go to Advanced Find and build a filter – it’s very simple, isn’t it?

Only it’s not, as I just learned.

If you look at any of the user-owned entities (we are not talking about the organization-owned entities), you will see four attributes related to this question:

The way they work is:

  • OwnerId is, essentially, a complex field. It can be a team reference, or it can be a user reference
  • When a record is assigned to a team (OwnerId references a team), the owningteam attribute is updated to reference the same team, and the owninguser attribute is set to null
  • When a record is assigned to a user (OwnerId references a user), the owninguser attribute is updated to reference the same user, and the owningteam attribute is set to null
  • In either case, owningbusinessunit attribute is set to the business unit of the owner (which is either a team business unit or a user business unit)
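To put the same rules in code form, here is a toy model of what happens to those attributes on Assign (just an illustration of the logic above – not SDK code, and all the values are made up):

```python
def assign(record, owner):
    """Toy model of how the owning* attributes change when a record is assigned.

    `owner` is a dict like:
      {"type": "team" or "user", "id": <name>, "businessunit": <bu name>}
    """
    record["ownerid"] = owner["id"]
    if owner["type"] == "team":
        # Assigned to a team: owningteam is set, owninguser is cleared
        record["owningteam"] = owner["id"]
        record["owninguser"] = None
    else:
        # Assigned to a user: owninguser is set, owningteam is cleared
        record["owninguser"] = owner["id"]
        record["owningteam"] = None
    # Either way, owningbusinessunit follows the owner's business unit
    record["owningbusinessunit"] = owner["businessunit"]
    return record

vehicle = assign({}, {"type": "team", "id": "BU 2 Sales Team", "businessunit": "BU 2"})
print(vehicle["owningteam"], vehicle["owninguser"], vehicle["owningbusinessunit"])
# -> BU 2 Sales Team None BU 2
```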

So far so good. But this is where it takes an unexpected turn.

How would you actually build a filter in the Advanced Find to answer the question above?

Can we use Owning Business Unit to define the filtering criteria?

Have a look at the screenshot:

As you can see, there is no OwningBusinessUnit there. Why would it not be there? That was the question I had to ask myself earlier today. The answer, as it turned out, is simple.. the OwningBusinessUnit field is not searchable:

And we can’t really change that setting. Well, this is starting to look like a very special field. However, there are other special fields, so it’s not that unusual to find another one. What is unusual is that, for some of the other entities, OwningBusinessUnit IS searchable. For example, it totally works for the Contact entity:

This is where my initial confusion came from. I did remember using the Owning Business Unit field in Advanced Find. I just did not realize I was likely using it with one of the out-of-the-box entities, not with a custom entity.

However, what it means is that we cannot actually create an Advanced Find filter that uses the OwningBusinessUnit attribute.

So what if we try something else..

Can we define the correct filter using some combination of the OwningTeam and OwningUser attributes?

Well, we could try something like this:

But that would not help at all, since we would be asking Dynamics to return all records which are owned by a team in business unit 2 AND by a user in business unit 2. There can only be one owner, so this would not work.

So, how about making it an “OR” condition? Let’s try Group OR?

This is where I ran into another problem:

We can’t use Group OR unless all those attributes are listed under the same entity.

Are you starting to see the problem?

What if we tried to define the filter differently?

If the owning team is not in any other BU and the owning user is not in any other BU, then one of them must be in the correct BU, right? So let’s try it:

Problem is, again, that this actually translates into a fetch query that is looking for an owning team AND an owning user.

And, then, we are suddenly out of options.

What’s left is:

  • We can build two different views (records owned by the individual users, and, also, records owned by the teams)
  • We can start customizing: add a new business unit lookup field, create a workflow/plugin to populate that field, then use it for the filters

Finally, there can be more advanced customizations:

  • we might build a pre-RetrieveMultiple plugin to add a condition on owningbusinessunit
  • we might use XrmToolBox/SDK to update the view without going through Advanced Find

With FetchXml, this kind of fetch works fine even though I can’t build it directly in Advanced Find:
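For example, something along these lines (a sketch – the entity name, attribute names, and GUID placeholder are all made up):

```xml
<fetch>
  <entity name="new_vehicle">
    <attribute name="new_name" />
    <filter>
      <!-- owningbusinessunit is perfectly usable in raw FetchXml,
           even though Advanced Find won't offer it for a custom entity -->
      <condition attribute="owningbusinessunit" operator="eq"
                 value="{business-unit-guid}" />
    </filter>
  </entity>
</fetch>
```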

Either way, I don’t really know why owningbusinessunit is not searchable for custom entities. Maybe it was implemented that way for performance reasons, I’m not sure. It’s probably going to stay a mystery – there are a few of those in Dynamics.

Plug-ins registered for RetrieveMultiple are not called for FetchXml reports

This is just a quick reminder to anyone thinking of creating a RetrieveMultiple plugin: such plugins are not called for FetchXml reports (apparently, they are not called for SQL reports either).

Why is it important?

Usually, we would use a RetrieveMultiple plugin either to modify the query (in the pre-RetrieveMultiple stage) or to do something with the records that have been retrieved (in the post-RetrieveMultiple stage). For example, we might use it to implement some sort of additional custom filtering. We might also use RetrieveMultiple plugins to implement custom security, or language translation support. There are plenty of interesting scenarios where RetrieveMultiple plugins can be very useful.

However, this is exactly why we need to remember that little note above: plug-ins registered for RetrieveMultiple are not called for FetchXml reports.

Our highly elaborate security won’t work in the reports. Customized filtering won’t work there either. Language translations would have to be implemented separately. Basically, we will have to re-implement all that logic in the reports as well. And there is still the “Report Wizard” in Dynamics, so anyone with sufficient permissions can create their own report, which, again, will not be aware of the customizations we have implemented in the RetrieveMultiple plugins.

Are RetrieveMultiple plugins useless, then? Not at all. They simply have their limitations.


RDL file is not valid (a horror story you should read anyway if you are an aspiring Dynamics consultant)

It was one of those unexpected errors that can drive you crazy, since they provide absolutely no details as to why they are happening. Yet, I was not doing anything extraordinary – I was merely trying to create an SSRS report for Dynamics 365 (turned out that’s an extraordinary task.. sometimes). It was 11 PM on Friday night (yes, I know.. who does that at 11 PM on Friday night..), and I was constantly getting this error:


I was also getting it at 12 AM, and at 1 AM on Saturday, and, then, I called it a day and gave up (only to get back to it later on Saturday – see below).

The error went like this:

Error Uploading Report
This report can’t upload. This issue can occur when the Report Definition Language (RDL) file is not valid. If you contact support, please provide the technical details.
Activity ID: 47ebe340-68ed-4a4c-95b0-b5e1061c2b57

And, of course, it had some details in the log file:

Unhandled Exception: System.ServiceModel.FaultException`1[[Microsoft.Xrm.Sdk.OrganizationServiceFault, Microsoft.Xrm.Sdk, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35]]: An error occurred while trying to add the report to Microsoft Dynamics 365. Try adding the report again. If this problem persists, contact your system administrator. Detail:
<OrganizationServiceFault xmlns:i="" xmlns="">
<ErrorDetails xmlns:d2p1="" />
<Message>An error occurred while trying to add the report to Microsoft Dynamics 365. Try adding the report again. If this problem persists, contact your system administrator.</Message>
<ExceptionSource i:nil="true" />
<InnerFault i:nil="true" />
<OriginalException i:nil="true" />
<TraceText i:nil="true" />

Unfortunately, those details were not particularly useful.

Did I mention I had never created reports for Dynamics 365 on this particular laptop? So, before I started on Friday night, I had installed the following components:

  • Visual Studio 2015 and Data Tools
  • Report Authoring Extensions for Dynamics 365

That is supposed to be a supported configuration, according to the download page:

And, yet, it was not working.

So, how do you troubleshoot an issue like this? You can keep poking around, but there is a simple test:

  • I’ve created a new report using Dynamics report wizard
  • And downloaded it
  • And loaded it into the Visual Studio
  • Did some minor changes
  • And tried to upload it back into Dynamics

Guess what – I got exactly the same error! But that meant I was onto something: the only possible explanation was that Visual Studio had updated the report in such a way that it had somehow become incompatible with Dynamics.

What could be incompatible? Most likely, the format of that RDL file. The report was working fine in Visual Studio, though, so I spent about an hour looking at the schema, trying to get some clue.

And then I thought: is there actually a way to target different versions of the SSRS server when building a report? That was actually the right question, and there was an answer:

Did you think that was the end of it? Well, why would it be.. I am writing a detective story here :)

So, no, it was not the end. I made those changes, I configured the project to target 2008-2014 versions:

It was around 10 AM on Saturday, btw..

And I tried uploading the report to Dynamics. The same stubborn error showed up again!

I spent the following 30 minutes scratching my head – this was kind of beyond me. But you can’t keep scratching your head forever when you need the report to be up and running.. so, why was it not making any difference? Was it the wrong option? No – I was on the right track, that was my gut feeling at the time. What did I miss? After a bit more scratching, I figured: why don’t I look at the RDL again – maybe I’d be able to spot the version number there? Sure enough, it was right there, right at the top of the file, and for some strange reason it was still telling me the report was targeting the 2016 version:

Now this was strange..

What do we do when there is a question? We ask Google. Here is what Google answered:

God bless you, Google! And Rami A, whoever you are, who provided the final clue:

This is by design.


Posted by Riccardo [MSFT] on 12/18/2015 at 5:07 PM
By design, TargetServerVersion affects build output files, not source files. You deploy the build output files (which you can grab from the \bin\Debug or \bin\Release folder within your project) rather than the source files.

So there we go! If you ever run into this problem, do this:

  • Change the target server version of your report project (make it 2008, …, 2014)
  • Build/Rebuild the project!!
  • Do not upload the original RDL file to Dynamics – instead, use the one you’ll find in the project’s “bin\Debug” or “bin\Release” folder
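A quick way to confirm you are uploading the right file is to peek at the namespace on the RDL’s root element (the namespaces below are shown from memory – double-check them against your own files):

```xml
<!-- A report built for SSRS 2016 starts roughly like this
     (this is the version Dynamics 365 rejected): -->
<Report xmlns="http://schemas.microsoft.com/sqlserver/reporting/2016/01/reportdefinition">

<!-- The file produced in bin\Debug after rebuilding with
     TargetServerVersion set to 2008-2014 references the 2008 schema: -->
<Report xmlns="http://schemas.microsoft.com/sqlserver/reporting/2008/01/reportdefinition">
```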

(and, probably, don’t do those things in the new environment on a Friday night when you are supposed to be getting ready for the weekend instead:) )

Masked fields in Dynamics 365

If you ever needed to create a password field in Dynamics, your first instinct was, probably, to create a javascript that would somehow encode the value entered into such a field.

Luckily, creating such fields is not exactly the most popular requirement, so we don’t have to think about this too often. However, when it does happen, it may end up being a bit of a problem.. because that javascript can turn out to be quite tricky.

So, if you are reading this post because you just found yourself in that kind of situation, there is some good news. You don’t need javascript!

Well, at least you have one other option which does not require coding.

You can setup a field security profile:

By the way, if you are not familiar with the field security profiles yet, you will need to read this:

What may not be clear is that you can allow “update” and “create” without giving “read” permissions.

As a result, a user who is not a system administrator, and who has been assigned that kind of field security profile, will see masked values when looking at the data:


Yet such a user will still be able to enter a new value into the protected field:


This is a quick workaround, but I admit it does not cover all scenarios. For example, there is no way in this scenario to allow the user to see the protected value once it’s been entered, without utilizing a javascript/plugin.

Still, it’s a quick and robust solution: it works everywhere (on the forms, in the views, in the reports), and it does not take long to set up a field security profile.


Interestingly enough, if you know how to update the customizations.xml file, it seems there is an attribute you can add to the file manually that is supposed to turn an input control into a password control:

However, it did not quite work for me when I tried. Or, rather, it did, but in some strange way – here is what I got:


It does not depend on the browser – the behavior was exactly the same in both Edge and Chrome.


Azure Functions and Dynamics

What comes to your mind when you realize you need to extend Dynamics with some server-side code?

  • Plugins?
  • Custom workflow activities?
  • External applications?

Those are all great options, but, for many years, they have been the only options available to us. And, yet, they all have limitations. Plugins can’t run on their own. Custom workflow activities have to run as part of the workflows. External applications need a server to run on. There is always a limitation.

But things are different now!

Have you tried Azure Functions yet? If not, you probably should.

For example, how about that famous “scheduled workflow” problem? It’s been bugging us since the early days of Dynamics, and there has never been a good solution. There is still no standard/simple solution in Dynamics, but stop thinking Dynamics.. think Azure/365.

You can create a timer Azure Function that connects to Dynamics, set up a schedule for that function, and voila.. Problem solved without any tricks.
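For instance, the “every hour” part is just a timer binding in the function’s function.json. A sketch (the binding name here is arbitrary; Azure Functions use a six-field NCRONTAB expression, so “0 0 * * * *” means “at minute zero of every hour”):

```json
{
  "bindings": [
    {
      "type": "timerTrigger",
      "direction": "in",
      "name": "myTimer",
      "schedule": "0 0 * * * *"
    }
  ],
  "disabled": false
}
```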

How long does it take to create an Azure Function? Assuming you know how to write custom code for Dynamics and, also, assuming you have access to the Azure Portal with all the proper permissions, it does not take long. It took me about 1-2 hours without any previous experience with Azure Functions, though these three links were of great help:

Those links have all the information you need to create an Azure Function that connects to Dynamics, so it’s probably not worth repeating all the same details in this post, but I’d like to summarize what I got, eventually:

  • An Azure Function
  • That uses CrmServiceClient to connect to CRM
  • Then, it queries data from Dynamics using a FetchExpression
  • Finally, it updates records in Dynamics
  • And it is scheduled to run every hour

No need for a scheduled workflow.. no need for an external application.. no need for a dedicated server.. And, this is not to mention, it was unexpectedly easy to do!