Monthly Archives: January 2018

Using GetAttributeValue – same result, different meaning, depending on the context

 

We can use entity["<attributeName>"], or we can use entity.GetAttributeValue<T>("<attributeName>").

The second option won’t produce an exception if the attribute is missing – it will return null instead. However, it will also return null if the value of that attribute is, actually, null. The two results look the same, but they mean different things. So when does it matter?

Imagine these two scenarios:

 

[diagram]

In the first scenario, we query an entity from Dynamics using the OrganizationService and ask for a specific attribute; if the value is null, that attribute simply won’t be included in the result. GetAttributeValue will give us null, which is fine since we know how to interpret that result.

In the second scenario, we have a “Target” entity from the plugin context. However, that entity won’t contain any of the attributes that were not submitted from the client. We can still call GetAttributeValue, and it will still return null for such attributes, but we won’t be able to tell whether an attribute simply was not provided or whether its value was actually cleared and “null” is what’s being set (imagine a whole-number field which had some value, then the user emptied that field control and hit save). This is when, if you want to know exactly what’s happening, it may be better to use Entity.Contains("<attributeName>") first, and, only then, compare the value to null (Entity["<attributeName>"] == null).
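Here is a minimal sketch of that kind of check in a plugin (the “new_amount” whole-number attribute is just a made-up example):

using Microsoft.Xrm.Sdk;

public static class TargetInspectionSample
{
    public static void Inspect(Entity target)
    {
        // Never throws: returns null both when the attribute is missing
        // and when it is present with a null value
        int? amount = target.GetAttributeValue<int?>("new_amount");

        if (!target.Contains("new_amount"))
        {
            // The attribute was not submitted from the client at all
        }
        else if (target["new_amount"] == null)
        {
            // The attribute was submitted, and the field was cleared (set to null)
        }
        else
        {
            // The attribute was submitted with an actual value
        }
    }
}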

Dynamics: how process duration is calculated

 

When looking at a BPF-enabled entity in Dynamics, we can see the process duration there – here is an example:

[screenshot]

In case you were wondering how the duration (“18 days, 1 hour” in this example) is calculated, here is a diagram:

[diagram]

If the process is still active, what shows up in the duration area is simply the difference between the current date and the process “Created On” date.

If the process has been aborted or finished, what shows up is the difference between the process “Completed On” date and the process “Created On” date.
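In other words, the logic is roughly this (a simplified sketch of the two rules above, not the actual Dynamics code):

using System;

static class ProcessDurationSample
{
    public static TimeSpan GetProcessDuration(DateTime createdOn, DateTime? completedOn, bool isActive)
    {
        // Active process: current date minus "Created On"
        if (isActive)
            return DateTime.UtcNow - createdOn;

        // Aborted or finished process: "Completed On" minus "Created On"
        return completedOn.Value - createdOn;
    }
}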

It’s also worth mentioning that the “Duration” field on the business process entity will only be populated once the process has been either completed or aborted. For active processes, that column will be empty:

[screenshot]

 

And below are some screenshots from the Chrome dev tools where you can see how the calculations happen in JavaScript.

Analyzing the process status (Completed/Aborted/Active) – this function sets the $v_2 property to be used later:

[screenshot]

Creating the duration text (this function uses the $v_2 property set above):

[screenshot]

[screenshot]

Calculating the process duration (get_bpfInstanceActiveFor and get_bpfInstanceCompletedIn are utilized in the code above):

[screenshot]

$1i_0 and $29_0 for the code above (the former is set to createdOn, and the latter is set to completedOn):

[screenshot]

XperiDo trial – displaying related records in your documents

One of the main problems with the out-of-the-box Word templates in Dynamics is that we can’t really use lists. Well, we can add a repeater, but we cannot sort those lists. And, if we cannot sort them, they become somewhat useless once there are more than a few related records in the list.

This, it seems, is one area where XperiDo can certainly do more.

For example, in the screenshot below I have 148 contacts linked to the A. Datum account:

[screenshot]

Most of them I just imported using an Excel spreadsheet, and those are all “Test Test” contacts. But, depending on the sort order, this list will display different contacts at the top – compare it with the screenshot below:

[screenshot]

This is something we can actually do with XperiDo:

[screenshot]

[screenshot]

Now when I go back to Dynamics to create the document, I get all 148 contacts sorted in the right order:

[screenshot]

[screenshot]

XperiDo – starting the trial

Before I continue.. I’m not affiliated with XperiDo, and I’m pretty sure they have no idea I am writing this blog post right now. Still, I was looking for a document generation solution, and, since XperiDo seems to have some interesting features, I figured I’d try it a few days ago:

https://www.xperido.com/support/blog/microsoft-dynamics-crm-2016-document-generation-vs-xperido

Not pretending to be an expert in XperiDo, I’ll just share a couple of observations which may help you get up and running a bit faster if you do decide to try it as well:

1. When you get a trial, you get a combo: Dynamics Trial & XperiDo trial

   Yes, they are actually going to set up the whole environment for you. You can opt out of that if you want to try XperiDo with your own Dynamics instance, but I think it’s worth doing the trial in the isolated environment, especially since you’ll be getting a bunch of pre-configured templates as part of the setup.

2. XperiDo installs an add-in for Word –  you can use that add-in to design document templates

  However, make sure to install the correct version (64-bit or 32-bit) and all the prerequisites. If you still can’t see the XperiDo ribbon after that (other than those 3 buttons in the screenshot below):

[screenshot]

Try starting Word “as administrator” – that’s what solved the issue for me when everything else failed:

[screenshot]

You may also need to keep this in mind later since, every now and then, Word will open normally (there are other places you can open it from to edit the templates), and you won’t see the ribbon.

If none of that helps, try contacting XperiDo – you will likely get a follow-up email from their sales team shortly after your trial starts, so you can just reply to that email. They do answer (although I’m not sure how much effort XperiDo would really want to put into “trial” support).

Why is it that every organization wants to have a unique client portal?

Isn’t it a legitimate question? If we look at it from the client perspective?

Personally, I don’t want to deal with a unique portal every time. I would rather have a single portal with a familiar interface, with single sign-on, and with all the details from various credit cards, banking accounts, loan accounts, tax accounts, cell phone accounts, internet accounts, and whatever other accounts collected in one place.

Pretty sure I’m not alone in that.

And I understand that, traditionally, client portals are implemented as part of the organization’s web site for all the right reasons (branding, marketing, upsell, etc.), but it just creates so much havoc. Different statement formats, different payment methods, different logins and passwords, different support channels.. When all I need is one portal where I can see all my information – pretty sure it should be possible to create such a thing (even if it were a paid subscription, I think many of us, “clients”, might agree to pay a small monthly fee for the convenience of having all that info in one place).

But how do you turn this ship around now, when pretty much every organization has already invested in the development of its own portal?

Custom indexes and solution upgrades

In the on-prem version of Dynamics, we can create our own custom indexes. And, even though it’s a supported customization (here is a reference: https://msdn.microsoft.com/en-us/library/gg328350.aspx), there is at least one scenario which may fail.

Basically, the problem is that custom indexes are not managed by Dynamics, which means that any operation conflicting with such an index will fail.

For instance, when applying a solution upgrade to a managed solution, and when the updated version of that managed solution no longer has a field on which there is already a custom index, you will see this error in Dynamics:

[screenshot]

The dreaded SQL Server Error..

In my case, this error occurred because of the custom index:

[screenshot]

Once I removed the index, the solution upgrade worked just fine:

[screenshot]

This does not seem to be a problem for unmanaged solutions, but that’s expected since we can’t delete a field by importing an unmanaged solution. On the other hand, when importing an unmanaged solution, you may still change the text field length of an existing field.. I did not try it, but I’m guessing Dynamics might not be able to publish those changes if the maximum size of all the columns included in the index exceeds 900 bytes, since there is such a limit on the SQL Server side.

Turns out business process flow entities have super powers

 

Did you know that, when setting up a workflow for a business process flow entity (Opportunity Sales Process, for example), we can configure that workflow to trigger on change of the related entities?

[screenshot]

[screenshot]

In the example above, we can trigger that workflow on change of the related opportunity entity fields etc. That’s an interesting feature which makes it possible to do this:

https://blogs.msdn.microsoft.com/crminthefield/2017/12/18/automate-business-process-flow-stages-using-workflows/#comment-48546

And which makes my older post on this topic pretty much outdated:

http://www.itaintboring.com/dynamics-crm/lets-rule-the-business-process-with-a-workflow/#comment-1361

The only caveat is that it has to be a background workflow since that functionality is not available for real-time workflows.

One thing I noticed when I tried it quickly is that there seems to be a bug – whenever I use the “Select” button to add more fields, I have to make sure I click “Save” and close the workflow designer window (or just “Save and Close”). Otherwise, if I click “Select” again to add more fields, all the selections I just made in the same designer “session” disappear:

[screenshot]

And, then, after I click “OK” and “Select” again:

[screenshot]

But, if I click “Save and Close” first, then open the same workflow in the designer and click “Select”, everything looks good:

[screenshot]

Dynamics and Machine Learning: Consuming the predictive web service

In the previous post, I stopped short of creating a plugin to consume the predictive web service directly from Dynamics. It still needs to happen to complete the whole exercise, so.. it’s time to do it.

First of all, when I looked at the Machine Learning Studio today, I could not get to the sample web service code right away. The easiest way I found after some digging around is this:

[screenshot]

[screenshot]

[screenshot]

And, then, I got this:

[screenshot]

Which is not exactly the same screen I was looking at yesterday.. but, anyway, that should do it.

Unfortunately, it did not take me long to find out that the sample code uses async methods and JSON serialization, so it would take a bit of effort to get it all into the plugin. Fortunately, I did not need to do it that way – after all, the same page has a sample of the JSON, so I ended up re-writing that code using WebClient and a pre-formatted JSON string:

using System;
using System.Net;

namespace MachineLearning.Plugins
{
    class PredictiveService
    {
        public static string GetBestAssignee(string customerName, string productName, string[] assignees)
        {
            // Ask the web service for a predicted resolution time per assignee,
            // and pick the assignee with the lowest (best) prediction
            string bestAssignee = null;
            decimal bestTime = 1000000000;
            foreach (var assignee in assignees)
            {
                decimal assigneeTime = GetPrediction(customerName, assignee, productName);
                if (bestTime > assigneeTime)
                {
                    bestAssignee = assignee;
                    bestTime = assigneeTime;
                }
            }
            return bestAssignee;
        }

        public static decimal GetPrediction(string customerName, string assigneeName, string productName)
        {
            decimal result = 10000000;

            // Pre-formatted request JSON: double braces are literal braces
            // escaped for the string.Format call below
            var requestJson = @"{{
                   ""Inputs"": {{
                     ""input1"": {{
                       ""ColumnNames"": [
                         ""title"",
                         ""CustomerIdName"",
                         ""ita_FirstAssigneeName"",
                         ""ResolutionTime"",
                         ""ProductIdName""
                       ],
                       ""Values"": [
                         [
                           ""Title"",
                           ""{0}"",
                           ""{1}"",
                           ""0"",
                           ""{2}""
                         ]
                       ]
                     }}
                   }},
                   ""GlobalParameters"": {{
                     ""Data gateway"": """"
                   }}
                 }}";

            requestJson = string.Format(requestJson, customerName, assigneeName, productName);

            const string apiKey = "..."; // Replace this with the API key for the web service
            var uri = "...";             // Replace this with the URL of the web service

            var cli = new WebClient();
            cli.Headers[HttpRequestHeader.ContentType] = "application/json";
            cli.Headers.Add("Authorization", "Bearer " + apiKey);
            string response = cli.UploadString(uri, requestJson);

            // The predicted value comes back as the last quoted string in the
            // response, so just cut it out instead of parsing the whole JSON
            int i = response.LastIndexOf("\"");
            if (i > -1)
            {
                int j = response.LastIndexOf("\"", i - 1);
                response = response.Substring(j + 1, i - j - 1);
                result = decimal.Parse(response);
                if (result < 0) result = 10000000;
            }
            return result;
        }
    }
}

Tested that in a console application quickly:

[screenshot]
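In case you want to reproduce that quick test, it’s really just a few lines (the customer, product, and assignee names below are made up – use values from your own training data):

using System;

namespace MachineLearning.Plugins
{
    class Program
    {
        static void Main(string[] args)
        {
            // Made-up test data – any names the model was trained on will do
            string[] assignees = { "John Doe", "Jane Doe" };
            string best = PredictiveService.GetBestAssignee("A. Datum", "Product A", assignees);
            Console.WriteLine("Best assignee: " + best);
        }
    }
}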

And went on to writing the plugin. The plugin is supposed to do this:

  • Get all possible assignee names (user names), product name, and client name
  • Pass all that information to the service above
  • Set the “First Assignee” field based on the outcome of those predictions (the user with the best predicted resolution time wins)

Here is the plugin code:

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

namespace MachineLearning.Plugins
{
    public class CaseFirstAssigneePlugin : IPlugin
    {
        public void Execute(IServiceProvider serviceProvider)
        {
            IPluginExecutionContext context = (IPluginExecutionContext)
                serviceProvider.GetService(typeof(IPluginExecutionContext));

            IOrganizationServiceFactory serviceFactory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
            IOrganizationService service = serviceFactory.CreateOrganizationService(context.UserId);

            // The step is registered with a post-entity image named "PostImage"
            if (!context.PostEntityImages.Contains("PostImage")) return;
            var postImage = context.PostEntityImages["PostImage"];

            if (postImage.Contains("productid") && postImage.Contains("customerid"))
            {
                string productName = ((EntityReference)postImage["productid"]).Name;
                string customerName = ((EntityReference)postImage["customerid"]).Name;

                // Get all possible assignees (every system user)
                QueryExpression qe = new QueryExpression("systemuser");
                qe.ColumnSet = new ColumnSet("fullname", "systemuserid");
                var users = service.RetrieveMultiple(qe);

                // Pick the user with the lowest (best) predicted resolution time
                Entity bestAssignee = null;
                decimal bestTime = 1000000000;
                foreach (var u in users.Entities)
                {
                    decimal assigneeTime = MachineLearning.Plugins.PredictiveService.GetPrediction(
                        customerName,
                        (string)u["fullname"],
                        productName);
                    if (bestTime > assigneeTime)
                    {
                        bestAssignee = u;
                        bestTime = assigneeTime;
                    }
                }

                if (bestAssignee != null)
                {
                    // Set the "First Assignee" field on the case
                    Entity updatedEntity = new Entity(postImage.LogicalName);
                    updatedEntity.Id = postImage.Id;
                    updatedEntity["ita_firstassignee"] = bestAssignee.ToEntityReference();
                    service.Update(updatedEntity);
                }
            }
        }
    }
}

And here is how it’s been registered in Dynamics:

[screenshot]

That should be it.. Time for a quick test?

The plugin is only registered on Create of the case entity (it should not be difficult to adjust that, but you’ll need to add some logic to the plugin to decide what to do on Update if a case has already been assigned), so, for the test, I’ll create a new Case, and I’ll set the Product and Customer fields. Once I push the “Save” button, the plugin will kick in, and, once the form reloads, I should see the “First Assignee” field populated.. Kind of a lot of writing, but it’s really just this:

Step 1 – Creating a new Case:

[screenshot]

Step 2 – verifying that “First Assignee” field has been populated:

[screenshot]

That’s it! I now have a super intelligent case assignment mechanism that uses a predictive web service to choose the first assignee according to the best predicted case resolution time.

PS. You realize it’s been just a proof of concept, right? 🙂 Don’t be too harsh on the results of those predictions.. after all, the model has been trained on 10 completely made-up cases to start with.

Dynamics: Using Machine Learning to predict case resolution time

Now that I have a way to import Dynamics data to the Machine Learning Studio, it would be great to see if I can use it to, well, start predicting something.

Here is the scenario I’m going to try:

In Dynamics, I have added a few fields to the out-of-the-box case entity:

  • Resolution Time (decimal)
  • First Assignee (user lookup)

Would it be possible to use Machine Learning with Dynamics in order to choose the best First Assignee for every new case?

In a nutshell, I am guessing it’s supposed to work this way:

  • First, I will import some test data to Dynamics
  • Then, I will create a model in the Machine Learning Studio and train it using the data from Dynamics
  • After that, I will create a web service
  • And, finally, I will create a plugin to consume that web service and to automatically assign a case to a user based on the web service predictions

 

For the test data, I simply used an Excel spreadsheet to get 10 cases into Dynamics. You will see those cases in the screenshot below:

[screenshot]

In the test data, I used a resolution time of 10000 for those cases which were closed as unresolved.

To set up the experiment in the Machine Learning Studio, I pretty much followed this link:

https://docs.microsoft.com/en-us/azure/machine-learning/studio/create-experiment

So, here is what the experiment looked like as a result:

[screenshot]

The Import Data component was set up exactly the way it was described in the previous post, except that I used a different SQL query this time:

SELECT
title,
CustomerIdName,
ita_FirstAssigneeName,
CAST(ita_resolutiontime * 100 AS INT) AS ResolutionTime,
ProductIdName
FROM Incident

Notice that I could not use my decimal field as is, so there is a cast to int.. Also, if you look at the experiment screenshot above, you’ll see the “Select Columns in Dataset” component. That one is there to remove the case title from the columns. I could have changed the query instead, but this is a learning exercise for me, too, so I left the query as is. The reason I needed that “select columns” component is that, whatever the title is, it’s unlikely to affect case resolution time, so we can simply exclude it before training the model:

[screenshot]

At this point, everything was ready, so I followed another article to create a web service:

https://docs.microsoft.com/en-us/azure/machine-learning/studio/publish-a-machine-learning-web-service

Ran the experiment, created a predictive Web Service:

[screenshot]

That created a “predictive experiment”, so I ran that one.. and was, finally, able to create a web service:

[screenshot]

Got a web service, so I can test it now:

[screenshot]

[screenshot]

[screenshot]

Now I just need to create a plugin that will consume this web service and do case assignment based on the score.. This is for the next post, but, it seems, that part is going to be relatively straightforward – all the code required to consume this web service is available on the “consume” tab:

[screenshot]

Importing (on-prem) Dynamics data to the Machine Learning Studio

The Import Data component in the Machine Learning Studio supports on-premises gateways, so I got the idea that I should be able to use it to import data from the on-prem Dynamics database. What’s more, I was actually hoping to reuse the gateway which I had just installed, but, as it turned out, it was not the right gateway to start with.

When setting up the “Import Data” component for the experiment in the Machine Learning Studio, there is an option to use a gateway.

[screenshot]

First of all, it turned out that option won’t work for the free tier, so I had to re-create the workspace under my Visual Studio Enterprise subscription to start with.

Now, when adding a new gateway, the Machine Learning Studio will offer to download a gateway and to use a registration key to register it:

[screenshot]

What I found out is that this gateway is different from the gateway I used before, so.. yes, I had to uninstall that first gateway and install a new one instead, which is actually called “Microsoft Integration Runtime”:

[screenshot]

This version of the “gateway” did ask me for the authentication key, so I used the one provided by the Machine Learning Studio:

[screenshot]

Finally, got it registered:

[screenshot]

At which point I started the configuration manager and tried testing the connection:

[screenshot]

This did not work, but, after a bit of digging, it turned out this was happening because of the firewall on my SQL VM. Once I disabled the firewall, I could test the connection just fine. I ended up adding a rule for the default SQL port (1433):

[screenshot]

After which I was, finally, able to test the connection.

Back to the Machine Learning Studio!

Actually, it did recognize the gateway automatically – once I switched back to the browser, I saw this screen:

[screenshot]

So, I just had to specify the server name, configure the credentials, and, it seems, my data import component is ready:

[screenshot]

Now, how do I test it?

There is “run selected” option in the popup menu on the import data component:

[screenshot]

My first attempt to run it was a complete failure:

[screenshot]

I can’t use decimals.. nice. Well, let’s change the query – at this point, I’d be happy if it worked at all.. even with only two fields: Name and StatusCode

[screenshot]

This time “Run Selected” succeeded:

[screenshot]

Once that worked, “Visualize” and “Save Dataset” options became available under the “Results Dataset”:

[screenshot]

Selecting “Visualize”.. and, finally, I have my on-prem Dynamics data in the Machine Learning Studio:

[screenshot]

What I am going to do with this next is a different question.. but, at least, it’s a small step in the right direction.