Monthly Archives: June 2018

Moving Dynamics 365 instances between tenants – doable or not?

It’s just something I was exploring today in preparation for the MB2-715, and, it seems, there are ways to move instances from one tenant to another without having to actually migrate the data. At least that seems to be the case according to the discussions here:

This is likely going to involve a support request, but, nonetheless, it should be doable.

Easy Repro: what is it?


Easy Repro seems to be a bit of an unusual name for what is, basically, an automated UI testing framework, since it’s not so much about reproducing issues as it is about testing various UI scenarios:

And I wanted to emphasize it right here: Easy Repro is not a product, and neither is it a technology. It’s a relatively high-level library/framework developed by Microsoft to facilitate automated UI testing specifically on Dynamics 365 projects.

What’s powering that framework is the SeleniumHQ browser automation open-source project:

Selenium, and, more specifically, Selenium Web Driver, gives us a way to control the browser from code. For the purposes of Easy Repro it’s all about C# code, but, in general, it can be Java, PHP, etc.

From that standpoint, a UI test using Selenium Web Driver would normally look like this:


  • There would be some test code
  • Which we would use to instruct the web driver to execute certain browser commands
  • We would also use it to verify what’s been loaded into the browser in response to those commands (HTML elements, JavaScript, etc.)
  • Finally, we would perform the required validations
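To put those steps into perspective, here is what a raw Selenium Web Driver test could look like in C#. This is just a sketch – the URL and element IDs below are made up for illustration, and a real Dynamics page would be far harder to parse:

```csharp
using System;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

class RawSeleniumTest
{
    static void Main()
    {
        // Test code starts a browser session through the web driver
        using (IWebDriver driver = new ChromeDriver())
        {
            // Instruct the web driver to execute a browser command
            driver.Navigate().GoToUrl("https://myorg.crm.dynamics.com");

            // Find HTML elements by hand and interact with them
            // (these element IDs are hypothetical)
            driver.FindElement(By.Id("title_field")).SendKeys("Test case");
            driver.FindElement(By.Id("save_button")).Click();

            // Finally, verify what’s been loaded in response
            var title = driver.FindElement(By.Id("title_field"))
                              .GetAttribute("value");
            if (title != "Test case")
                throw new Exception("Validation failed");
        }
    }
}
```

Notice how every single step has to go through `FindElement` against the raw HTML – which is exactly the maintenance burden discussed below.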


This is all doable without Easy Repro – all you need is Visual Studio and a quick “how to”; for instance, you might use the one here:

However, the main problem with doing it that way is that you’d have to do a lot of HTML parsing. For every test you want to write, you’d have to identify HTML components so you could fill them with proper values; you’d have to find submit buttons; you’d have to analyze drop-down menu structures; you’d have to.. hope you get the idea. And, in order to verify the results of your tests, you’d have to dig into those HTMLs, find the identifiers, extract the values, and compare them with the expected results. As the number of tests grows, you’ll have to keep maintaining those test scripts, and this is going to start taking time away from other activities. That might still be worth it because of the benefits you’ll get from automated testing, but this is where, if you wanted to make it somewhat easier, Easy Repro might save you some time and effort.

That’s because, as suggested by Martin Fowler:

“When you write tests against a web page, you need to refer to elements within that web page in order to click links and determine what’s displayed. However, if you write tests that manipulate the HTML elements directly your tests will be brittle to changes in the UI. A page object wraps an HTML page, or fragment, with an application-specific API, allowing you to manipulate page elements without digging around in the HTML.”

Easy Repro is doing exactly that – it’s introducing a level of abstraction so as to minimize the need to work with HTML directly by isolating all that low-level work in the framework code. Just have a look:

– There are 3 projects in the solution


– Out of those 3, the first two (API and Browser) represent the actual framework

You will, actually, find classes such as LoginPage, Office365Page, etc. in the API project


– And the last project (Samples) shows you how to write the tests

For example, here is a sample test for creating a case in Dynamics


If you look at the code above, you’ll see how it’s calling a number of methods, but it’s not dealing with the HTML directly, and that’s exactly what Easy Repro is for. It gives you the classes (and methods) to work with various pages in Dynamics 365 so that, instead of having to work with the Web Driver by finding the HTML elements to click on etc, you can work with that high-level model and just call methods from that model.
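To give a flavour of what such a test looks like, here is a condensed version of a sample along those lines. The method names are paraphrased from the Easy Repro samples, so treat this as a sketch of the model rather than the exact API:

```csharp
using System;
using System.Security;
using Microsoft.Dynamics365.UIAutomation.Api;

public class CreateCaseTest
{
    // uri/username/password would come from the test configuration
    public void TestCreateCase(Uri xrmUri, SecureString user, SecureString pwd)
    {
        using (var xrmBrowser = new XrmBrowser(TestSettings.Options))
        {
            // High-level page objects – no By.Id, no HTML parsing
            xrmBrowser.LoginPage.Login(xrmUri, user, pwd);
            xrmBrowser.Navigation.OpenSubArea("Service", "Cases");
            xrmBrowser.CommandBar.ClickCommand("New Case");
            xrmBrowser.Entity.SetValue("title", "Test Case");
            xrmBrowser.CommandBar.ClickCommand("Save");
        }
    }
}
```

All the Selenium-level work (locating elements, waiting for the page, clicking) is hidden inside those page-object classes.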

If there are any changes in the underlying HTML model when an update for Dynamics is released, Microsoft will be updating the Easy Repro classes on GitHub, so you will only need to download the updated classes/assemblies, and all your custom tests which are using Easy Repro will continue to work.

All that said, I think it’s still an open question whether it’s going to be easy to maintain those tests, even when utilizing Easy Repro. Though that might also depend on how advanced you would want those tests to be.

For example, when looking at the sample tests, you will rarely (if ever) find a test reading a value from a field, although there is a GetValue method in the XrmPage class. Most of the tests are about setting some values and then saving the changes, so those samples don’t seem to take synchronous plugins/real-time workflows into account. As an alternative, maybe it would make sense to combine UI testing with server-side data testing. That is, once the UI test has completed, we might want to use regular SDK classes to go to Dynamics, query the data, and run additional validations on that data. That would not be a pure UI test anymore, but it might be part of the overall test scenario.
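Such a server-side check could be as simple as the sketch below, using the standard SDK classes. The expected title is, of course, just an example:

```csharp
using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public static class ServerSideChecks
{
    // Once the UI test has created the case, query Dynamics directly to
    // validate what synchronous plugins/real-time workflows have produced
    public static void VerifyCase(IOrganizationService service, Guid caseId)
    {
        Entity incident = service.Retrieve(
            "incident", caseId, new ColumnSet("title", "prioritycode"));

        // Validate server-side results that a pure UI test would miss
        if (incident.GetAttributeValue<string>("title") != "Test Case")
            throw new Exception("Unexpected title after save");
    }
}
```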

Either way, if you do need to automate UI testing on a Dynamics project, Easy Repro is definitely worth looking at. Just keep in mind it does require some coding skills, and you may also have to start setting time aside for the maintenance of those tests moving forward.

Scheduling Board in the Field Service – just how advanced is that thing?

It seems I was severely underestimating the Field Service, and I am wondering if this may happen to anybody who thinks of the Field Service in terms of “solutions”/“applications” for Dynamics.

There is a component in the Field Service solution that goes far beyond the regular capabilities of such components – it’s, essentially, an application on its own, and it’s the scheduling board:


It seems straightforward – there are things you want to schedule, there are resources, there is a calendar.. apparently, there is some search functionality. So, as expected, you can create the bookings. That’s pretty advanced as it is, with all the drag-and-drop, search tabs, etc., but what I did not quite realize until I started to dig into it is how configurable this component is.

When you double-click on the scheduling board name (mind you, it’s not the most obvious action you can come up with):


You are presented with a screen that has all sorts of configuration settings. A lot of them are about visuals, but there is a seemingly unimportant “other settings” area which, once you look into it more carefully, represents a built-in board designer:


For example, if I wanted to display the list of accounts in the “filtering” area.. like this:


It would only take me a couple of minutes to add that drop-down – that’s what can be done using Other Options->Filter Layout. Notice how there are all sorts of controls there:


But that’s not all. If I, then, wanted to use those additional filters in the scheduling queries, I could update the resources query:


That thing alone deserves a separate blog post, but, since there already is such a post, I’ll just give a link here:

Just don’t think it’s “FetchXml”. It kind of is, but it’s more than that – it’s a query “language” that can use FetchXML to get the data and then use XPath expressions to process that data. You can bring in data from different FetchXML statements, you can dynamically add filters (from the filtering area) to the data, and you can order the lists. Basically, you can completely change the way your scheduling works, since you can filter on whatever you want to filter on, and you can fine-tune the resource query to choose the resources that match your filters.

PS. And if you wanted to test it quickly just to get a feeling of how it’s done, have a look at the step-by-step here:

Skipping process initialization on create

We had a scenario where we did not want to initialize a BPF when a record is created – basically, those were records for which it would take some time before anybody would really need to start looking at them. Until that moment, they would be staying “dormant” in the system, and, then, once they’ve been marked as “ready”, the process would really start.

So, if we did start the BPF right away, we would end up with incorrect process durations, since all those records could spend days or weeks in the “dormant” state and then be processed in just a few days once activated.

Anyway, somehow we had to prevent Dynamics from automatically initializing a BPF for those records.

Turned out there is an extremely simple solution that’s been mentioned here:

We just had to set processid to Guid.Empty on all those records initially, and that’s just the right task for a simple plugin:

public void Execute(IServiceProvider serviceProvider)
{
    IPluginExecutionContext context = (IPluginExecutionContext)
        serviceProvider.GetService(typeof(IPluginExecutionContext));
    Entity target = (Entity)context.InputParameters["Target"];
    target["processid"] = Guid.Empty;
}

Register that plugin on pre-create of the entity, and that’s it. Every new record will get Guid.Empty for the processid, so Dynamics will skip initializing the BPF.

Parental relationship does not need “assign” permission to propagate the assignment

This might be something to keep in mind. If you have two entities with a parental relationship between them, your users may still be able to re-assign child records even if they don’t have “write”/“assign” permissions on the child entity.

In the example below, Sales Person role does not give “write” and/or “assign” permissions on the Test SLA entity:


So a Sales Person can’t do anything with Test SLA directly:


But they can still go to the parent record which is currently assigned to me:


And re-assign that record to themselves:


And here we go – that child “Test SLA” record is, now, re-assigned to the Sales Person user as well:


Dynamics: What are your Reporting Options?


When looking at the reporting options in Dynamics, it sometimes feels that there are just way too many, and, even though there are lots of good ones, there is always something that seems to be missing.

Just so we could start somewhere, let’s see what Microsoft has to say:

In other words, we are talking about SSRS, Dashboards, and Power BI. Realistically, though, dashboards are nothing but layout pages where we can add charts and views, so we should really be talking about charts and views there. Also, Power BI is somewhat limited in the on-prem environments, and SSRS is somewhat limited in the online environments.

But that’s not all. There are other options, too. After all, the purpose of reporting is to give additional insight into the data we have in the system, and, come to think of it, there are at least a few more options:

  • Advanced Find
  • Views
  • Excel export
  • Excel/Word templates
  • Power Query in Excel
  • Within the SSRS category, we have the report wizard and custom SSRS reports
  • It’s also possible to use Dynamics data with Cognos BI or other external tools
  • And, on top of that, there are calculated and rollup fields in Dynamics which we can use in conjunction with all the other options (probably more so with the views/charts)


So, how do we choose? I have compiled a table which, even if not very detailed, might get you started (and, also, might put this comparison in perspective):


Just a few clarifications:

  • Minimum Query Limitations – there are always some limitations, mostly because of FetchXML (but also because of how different tools treat fetch); SSRS and Power BI will have fewer of those
  • Clickable – this is all about having clickable links in the report to open records directly in Dynamics
  • Simple Drill-Down – it just comes with the dashboards; it looks like no other tool/approach can easily beat it


Now, if it looks like dashboards are winning this race – not necessarily. Dashboards are really good for a lot of things, but, in a way, they are a half-cooked tool. If you compare Power BI visualizations with the dashboards, you’ll see that Power BI is more powerful. The same goes for SSRS. Actually, Power BI (and SSRS) will likely beat dashboards anywhere – dashboards will be offering some basic options, and Power BI/SSRS will be offering an enhanced alternative.. except when it comes to the “drill-down” feature.

How do you choose, then? Is it on a case-by-case basis, or would you just say “go with Power BI” (for example) these days?

Plugin development: don’t use Context.Depth to prevent recursions!

It’s been said a number of times in different blogs that context.Depth may have side effects, but I am starting to think we should actually ban the practice of using context.Depth altogether.

It’s almost impossible to predict plugin execution paths once your solution gains any level of complexity, but, even in the relatively simple scenario below, you may quickly start noticing how your data is becoming inconsistent over time:


That validation plugin at the bottom will run when Entity 3 is being updated through the UI. And it will not run when Entity 3 is being updated from one of the other plugins (each of those may run in response to some other events).

So, basically, you may either have to repeat the validation logic in each and every plugin affecting Entity 3, or you may have to come up with some other way to avoid recursions. The easiest solution would be to carefully control which attributes are updated every time and which attributes are configured to trigger the plugin.
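For instance, instead of guarding with Depth, an updating plugin can send only the attributes that actually changed, so the update does not needlessly re-trigger the plugins registered on those attributes. A minimal sketch (the ita_status attribute is made up for illustration):

```csharp
using System;
using Microsoft.Xrm.Sdk;

public static class SafeUpdates
{
    // Write only what changed – if nothing changed, no update is sent,
    // so no downstream plugin fires and the recursion never starts
    public static void UpdateIfChanged(
        IOrganizationService service, Entity current, string newStatus)
    {
        var update = new Entity(current.LogicalName, current.Id);

        // ita_status is a hypothetical attribute used for illustration
        if (current.GetAttributeValue<string>("ita_status") != newStatus)
            update["ita_status"] = newStatus;

        // Only call Update when there is something to update
        if (update.Attributes.Count > 0)
            service.Update(update);
    }
}
```

Combined with registering each plugin on filtered attributes only, this keeps the execution paths predictable without ever touching context.Depth.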

The problem is, if you don’t do that, fixing those issues in an existing solution can quickly turn into a nightmare. When developing new functionality, you will be assuming that the validations are working, so you might not even bother to test some of them until it’s already too late and your data has become inconsistent. At which point you’ll have to fix it somehow, explain the consequences to the users, etc.

So take your time to craft the validation criteria carefully, don’t use Depth, and you should be able to save yourself from those troubles.

Field Service: What does negative quantity mean?

From the inventory tracking perspective, finding out that you actually have negative quantity at the warehouse might lead to some interesting experiences. I mean it would be nice if we could sell things which are not there yet. And you may find yourself in that kind of situation when working with the Field Service:


This is not necessarily a bad thing – in reality, what you have in the system will rarely be 100% in sync with the actual inventory, at the very least because of recording delays – but it’s interesting that the Field Service solution handles it a bit inconsistently; basically, how those negative quantities are handled depends on the situation.

For example, eventually we can get negatives:


If this happens, such products won’t be showing up on the transfer screen:


Even more – if I try creating an inventory transfer “manually” in this situation, I’ll get an error message like this:


So there are validations. And, still, I can go to an existing purchase order product receipt and update the quantity there:


For example, if I set it to 1 on the screenshot above, I’ll get the numbers updated right away:


(Looks like those calculations are not straightforward when it comes to negative numbers – I was expecting to see –10.95 after reducing the purchase order product quantity by 4)

Anyway, point being, in some scenarios you may get negative quantities, so don’t panic if that happens, have a look at all the data that might have affected the numbers, and, then, add some validations to the process if you need those.

Field Service: playing hide and seek with the products


I was setting up products for Field Service and made a bit of a rookie mistake, which resulted in the same product being available in some places but not in others.

See, on the following screenshot I have a few mugs in the Main warehouse:


However, when I tried creating a new purchase order, I could not really find any mugs:


Turned out that, even though those mugs were already showing up at the Main warehouse (so the system did allow me to create inventory adjustment records), I had not set up the product correctly.

It was still in the “draft” state, and that’s why it was not showing up in some views. Publishing the product


did take care of the issue:


And, on a related note, if you were wondering (like I was) what’s the purpose of the product type field


Here is what the user guide would tell you:


I am not sure this definition explains all the details, but, since the field is not mandatory, you might want to keep in mind that some lookup controls in the Field Service solution will be using filtered views where the filtering happens on that field (and some will not be using filtering at all). For example, you will see a product in the list of purchase order products only if that product is either an inventory or a non-inventory product:


Setting up the dev process: can we automate configuration change tracking?


It’s not unusual that we need to know what has changed since the last time we did anything in the Dynamics environment. As I mentioned in the previous post, we have some manual and half-manual options, but it would be nice to have a bit more automation there.

For example, there is a change log on the screenshot below – if we could get that kind of change log created automatically, that might be a good start:

The solution below is based on the original CRM Comparer tool:

Which had a reincarnation here:

And I just uploaded it (with some changes) to github:

It’s possible that the tool will not recognize some of the newer solution components (such as applications, for example), but we can work on that later.

If you follow setup instructions from the link above, you can, basically, do the following:

  • Configure the tool to connect to your instance of Dynamics
  • Create a solution in Dynamics and add all components you are interested in to that solution (make sure to keep maintaining the solution moving forward)
  • Create a scheduled task to run the following command periodically: ChangeTrackingTool.exe SolutionName


Every time that scheduled task runs, the tool will go to Dynamics, download the solution, and compare the contents of that solution with what was downloaded previously.
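On Windows, that scheduled task could be created with a one-liner like the following (the task name, path, and schedule are just an example):

```shell
rem Run the comparison nightly at 2 AM against the tracked solution
schtasks /Create /TN "DynamicsChangeTracking" /SC DAILY /ST 02:00 ^
    /TR "C:\Tools\ChangeTrackingTool.exe SolutionName"
```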

For example.. If you look at the screenshot above, you’ll see that ita_firstentity was created that time. Now let’s go to the solution in Dynamics, add an attribute, and put it on the form:

Here is how the form looked before:

And here is how it looks after adding another attribute – notice New Attribute on the form:

So now let’s start the tool and see what happens:

It does take a little bit to do the comparison, but, in the end, here is what shows up in Dynamics:

1. There is a new change log for the “TrackingTest” solution

2. And, if I open that record, I can see the details

In other words:

  • A new attribute on the ita_firstentity was created (ita_newattribute)
  • Information Form for the ita_firstentity was updated
  • And, more specifically, a row was added to that form


And what kind of row was that? Let’s open that particular component record:

If you have ever seen customizations.xml, the XML on the screenshot should look somewhat familiar, since, basically, it’s part of customizations.xml.

So give it a try – let me know what you think!