Monthly Archives: December 2017

Developing plugins – divide and conquer


One of the main problems with plugin development is that we have a very limited set of debugging techniques. We can use the Tracing Service, but that's not really debugging – it's tracing. We can keep throwing errors from the plugin, but that's just another form of tracing. We can use the plugin profiler, and that comes very close to debugging, but we have to install the profiler, capture the log, and trust that we are looking at the right log. And even then it's not debugging of the original execution – it's debugging of a replay of that execution.

Those are all useful techniques, but, personally, I find it very helpful to think of plugin development as a special kind of development precisely because of the lack of debugging tools. Basically, you want to do as little debugging as possible, so, ideally, any code you are testing should be obvious and clear enough for you to be able to fix it without having to use debugging tools.

In other words, it’s all about that old well-known “divide and conquer” principle. When working on a plugin, keep dividing the functionality into very small pieces, then start adding those individual pieces to the code, but don’t go to the next piece until the current one has been tested.


You want to do it line by line? Not a problem, whatever suits you.

For example, imagine you have to write an on-Update plugin to check an organization's credit limit against some pre-configured setting, so, basically, you need to do this:

  • Get attributes from the plugin context Target
  • Run a query to retrieve pre-configured maximum
  • Do the comparison and throw an exception if required


If you write the whole plugin from top to bottom, you will likely have at least a few errors there, and you may have to debug your plugin to find out what went wrong.

If you did it in the “divide and conquer” way, you might do this:

  • Create a basic plugin with no functionality, deploy it in Dynamics
  • Add the code to get attributes from the target, use InvalidPluginExecutionException to quickly ensure that you got the correct values, and run a few tests (all values there, some values missing, etc.)
  • Add the code to run the query, use InvalidPluginExecutionException to ensure you have retrieved the right value from the query (again, try different scenarios to make sure it works as expected)
  • Finally, do the comparison and test the whole plugin
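
To make this concrete, here is a minimal sketch of what the finished plugin might look like, with the "pieces" marked in comments. The entity and attribute names (account, creditlimit, ita_settings, ita_maxcreditlimit) are my own assumptions for illustration, not something prescribed by the platform:

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public class CreditLimitCheckPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var factory = (IOrganizationServiceFactory)serviceProvider.GetService(typeof(IOrganizationServiceFactory));
        var service = factory.CreateOrganizationService(context.UserId);

        // Piece 1: get the attribute from the plugin context Target
        var target = (Entity)context.InputParameters["Target"];
        if (!target.Contains("creditlimit")) return;
        var creditLimit = target.GetAttributeValue<Money>("creditlimit");
        // While testing this piece, you could throw here to "trace" the value:
        // throw new InvalidPluginExecutionException("Credit limit: " + creditLimit.Value);

        // Piece 2: run a query to retrieve the pre-configured maximum
        var query = new QueryExpression("ita_settings")
        {
            ColumnSet = new ColumnSet("ita_maxcreditlimit"),
            TopCount = 1
        };
        var settings = service.RetrieveMultiple(query).Entities;
        if (settings.Count == 0) return;
        var max = settings[0].GetAttributeValue<Money>("ita_maxcreditlimit");

        // Piece 3: do the comparison and throw if required
        if (creditLimit != null && max != null && creditLimit.Value > max.Value)
            throw new InvalidPluginExecutionException("The credit limit exceeds the configured maximum.");
    }
}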


This may look like overhead to a .NET developer who is used to debugging their apps in Visual Studio, but that's also the main reason I rarely need to use debugging tools when developing plugins, and that, in the end, saves me quite a bit of time.

Plugins recursion – how can you handle it?


What if you register a plugin step on update of some attribute, and then update the same attribute from within the plugin?


Dynamics can't decide for you when it's the right time to stop this endless loop, but it will detect the recursion and, after a few repetitions, stop it with an error.

It will do the same in many different situations. It can be one plugin, a chain of plugins, a chain of workflows, or a mix of those. They can be synchronous or asynchronous; they just have to be executing as a result of the same original operation. There are some finer settings (which you can't easily change, especially in online environments) that control when Dynamics will interfere, but, in the end, as soon as the number of those sequential plugins/workflows reaches a certain limit within a pre-defined timeframe, you will get a similar error message.

There are a few optimizations/workarounds available to you:

– Always define specific triggering attributes for your steps. Don't just leave "all attributes" there. That way, you will reduce the possibility of accidental recursion, but you won't be able to completely avoid it.

– You can use the context.Depth property, which is available in the plugin execution context. It keeps increasing for every new workflow/plugin in the "chain". For every new "update", Depth starts at 1. Then, if, inside the first plugin step, you do something that starts another plugin, that second plugin will have Depth=2 in its context. And it will keep increasing until it reaches the point where Dynamics interrupts the process. Keep in mind, though, that those have to be sequential events where one event is causing the other. If, for instance, there are two pre-operation steps on Update of the same entity, both responding to the update of the same attributes, Depth will be the same in those two steps (since, from the Depth perspective, they are "parallel", not sequential). This is why you will often see this condition in plugins:

if (context.Depth > 1) return;

That works, but it introduces a hidden problem. Imagine that you have a plugin that's doing something on update and that has that kind of condition. You have deployed the plugin, and you have tested it by opening the entity in Dynamics, updating a field, and verifying that the plugin kicked in. It is Depth 1, so it's all good.

But what if we change it a little, and the same field is updated from a workflow instead? Now it's, suddenly, Depth 2, and the plugin will not kick in when the update happens from a workflow/another plugin. Which you may not discover until somebody has created such a workflow and started to use it, and then somebody else has pointed out that the data is not consistent anymore.
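
If you do need a guard, here is one possible alternative to the blanket Depth check – just a sketch, not the only way to do it: mark the context with a shared variable and walk the parent contexts, so the plugin only skips executions that were triggered by its own update (the marker key below is made up for this example):

using System;
using Microsoft.Xrm.Sdk;

public class GuardedPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        const string marker = "ita_GuardedPluginHasRun"; // hypothetical key – pick your own

        // If any context up the chain carries the marker, this execution was
        // caused by our own Update call, so we can safely skip it
        for (var parent = context.ParentContext; parent != null; parent = parent.ParentContext)
        {
            if (parent.SharedVariables.Contains(marker)) return;
        }
        context.SharedVariables[marker] = true;

        // ... the actual plugin logic (including the Update call) goes here ...
    }
}

Unlike the Depth check, this does not block updates coming from unrelated workflows/plugins – it only reacts to this plugin's own chain.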

– Do not make the beginner’s mistake

Never do this:

Entity entity = (Entity)context.InputParameters["Target"];
service.Update(entity);

Instead, always create a new in-memory entity, assign the id, and set only the attributes you want to update:

Entity entity = (Entity)context.InputParameters["Target"];
Entity updatedEntity = new Entity(entity.LogicalName);
updatedEntity.Id = entity.Id;
updatedEntity["attribute"] = value;
service.Update(updatedEntity);

– Try not to call service.Update at all. In a pre-operation step, you can use the Target to set the attributes. There is no need for a separate Update call, and, if there is no such call, there is no recursion at all:

Entity entity = (Entity)context.InputParameters["Target"];
entity["attribute"] = value;

Paging in FetchXml

I was working on some code that needed paging with FetchXml. There is a good example on MSDN:

https://msdn.microsoft.com/en-us/library/gg328046.aspx

But, in case you don't want to mess with XmlDocument (not that it's really that messy), you can easily do the same using a simple string Replace – there is an example below. The only tricky part is that a few characters in the paging cookie have to be encoded before you can put it into the FetchXml, but that's about it:

        public EntityCollection RetrieveContacts(IOrganizationService service, int page, string pagingCookie)
        {
            if (!string.IsNullOrEmpty(pagingCookie))
                pagingCookie = pagingCookie.Replace("\"", "'").Replace(">", "&gt;").Replace("<", "&lt;");
            string fetchXml =
                    @"<fetch version=""1.0""
                          count=""25""
                          page=""{0}""
                          paging-cookie=""{1}""
                          returntotalrecordcount=""true""
                          output-format=""xml-platform""
                          mapping=""logical""
                          distinct=""false"">
                        <entity name=""contact"">
                          <attribute name=""contactid"" />
                        </entity>
                    </fetch>";
            fetchXml = string.Format(fetchXml, page, pagingCookie);
            var qe = new FetchExpression(fetchXml);
            var result = service.RetrieveMultiple(qe);
            return result;
        }

        int pageNumber = 1;
        string pagingCookie = "";
        EntityCollection result = null;
        do
        {
            result = RetrieveContacts(service, pageNumber, pagingCookie);
            foreach (var e in result.Entities)
            {
                …
            }
            pagingCookie = result.PagingCookie;
            pageNumber++;
        } while (result.MoreRecords);
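
And, just for comparison, here is a sketch of how the same thing could be done closer to the MSDN way, using XDocument so the encoding is handled for you. This is my own take, not the exact MSDN code, and it assumes the fetch XML does not already contain the {0}/{1} placeholders:

using System.Xml.Linq;

public static string AddPaging(string fetchXml, int page, int count, string pagingCookie)
{
    var doc = XDocument.Parse(fetchXml);
    doc.Root.SetAttributeValue("page", page);
    doc.Root.SetAttributeValue("count", count);
    // attribute values are encoded automatically, so no Replace calls are needed
    if (!string.IsNullOrEmpty(pagingCookie))
        doc.Root.SetAttributeValue("paging-cookie", pagingCookie);
    return doc.ToString();
}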


Dates in Dynamics: TimeZone Independent vs User Local

Have you ever had to look into the DateTime field behavior in Dynamics? DateTime fields can be "User Local", "Date Only", or "Time Zone Independent":

https://technet.microsoft.com/en-us/library/dn946904.aspx

If what you see at that link does not look 100% clear (and, to me, it did not, until a few days ago), let me try to explain it from a slightly different perspective.

See, there are two sides to it. Yeah, you could have guessed – it's Dynamics, and it's SQL.

On the SQL side, there is a DateTime column – say it stores a timestamp of 12/10/2017 12:28 PM.

What happens to that column in Dynamics depends on the behavior.

If it's "User Local", that timestamp will show up in the Dynamics interface in the user's timezone – the stored value is treated as UTC. A Dynamics user in the EST timezone (UTC-5), for example, would see it as 7:28 AM.

If it’s “Time Zone Independent”, it would show up in the Dynamics interface as is. In other words, it would be 12:28 PM for a user in ANY timezone.

If it’s “Date Only”, it’ll actually show up as Date Only: 12/10/2017 for a user in ANY timezone.  That’s why, if you create a DateTime field with Date Only behavior, it will be stored in SQL with 00:00 for time.

But that’s where you may see some interesting side effects if you ever have to change the behavior of an existing field.

You can't do it for "created on", but imagine your own Date and Time field with User Local behavior. Say you go to Dynamics and change the behavior of that field to "Time Zone Independent", for example.

This is going to change how Dynamics treats the field, but not what is stored in SQL for the existing data. If there were timestamps of 12/10/2017 12:58 PM, they will not be updated. If there were timestamps of 12/10/2017 4:00 AM, they will not be updated either.

However, for a user in EST, that 4:00 AM used to show up as 12/9/2017 11:00 PM before the change (UTC-5). After the change, it will be showing up as 12/10/2017 4:00 AM (no conversion from what's stored in SQL).
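
If it helps, here is a tiny standalone sketch of that math – plain .NET, nothing Dynamics-specific, and the Windows timezone id below is just an example:

using System;

class DateBehaviorDemo
{
    static void Main()
    {
        // the value as it sits in the SQL column
        var stored = new DateTime(2017, 12, 10, 4, 0, 0, DateTimeKind.Utc);
        var est = TimeZoneInfo.FindSystemTimeZoneById("Eastern Standard Time");

        // User Local: converted to the user's timezone for display
        Console.WriteLine(TimeZoneInfo.ConvertTimeFromUtc(stored, est)); // 12/9/2017 11:00:00 PM

        // Time Zone Independent: displayed as-is, for a user in ANY timezone
        Console.WriteLine(stored); // 12/10/2017 4:00:00 AM
    }
}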

Hope this helps.

Happy 365-ing!

Dynamics Plugin Developer Training

I’ve been working on the Dynamics Plugin Developer Course lately, and, it seems, it’s finally shaping up:

Plugin Developer Course

At this point, I may need a few "testers" (although, guinea pigs might be a better name for what you are going to be up to, since I really need to test the whole model on someone :) ). If you are a Dynamics functional consultant looking to add some dev skills to your profile, if you are a .NET developer now working on a Dynamics project, or if you are just trying to figure out how to develop Dynamics plugins purely out of curiosity, have a look and let me know if you'd be interested.

Dynamics 365: The craziest thing I learned lately..

You know how we've always been looking for ways to export data from Dynamics, and it's never been a simple task. There are reports, there are templates, there are plugins, workflows, and custom actions. But how about this (hint – it's all javascript):

Export data from Dynamics


This is something that I learned from the guys at MNP (formerly "A Hundred Answers"). And the technique itself is pretty well documented here:

https://stackoverflow.com/questions/14964035/how-to-export-javascript-array-info-to-csv-on-client-side

When applied to Dynamics, here is how it works step by step, and I will provide a link to the demo solution below:

  • We can use javascript to download an "in-memory" csv file from the browser. Open the script at the link above, search for exportToCsv, and you'll get the idea of how it's done.
  • In V9, we can use Xrm.WebApi.retrieveMultipleRecords to get the list of records.
  • So all we really need to do is combine the two items above. We can use this with a ribbon button, or we can just add a web resource with a regular html button in it (that's what I did in the demo solution below).


You can download the demo solution here: http://itaintboring.com/downloads/DownloadScriptDemo_1_0_0_0.zip

Once you get the solution, create a DD Company record, add a few DD Location records, and click the Export button (have a look at the screenshots above) to see how it works.


You will see the web resource below. It's somewhat hardcoded to work with the ita_ddcompany and ita_ddlocation entities, but it should not be difficult to make it work with other entities. Although, it's worth mentioning a few things about the javascript there:

  • Xrm.WebApi.retrieveMultipleRecords would not work in the html web resource just like that. It's expecting to have access to the jQueryApi object, which seems to be created while the form is being loaded. Unfortunately, it's not available in ClientGlobalContext.js, so there is a get_jQueryApi function in the javascript below. It works, even though I'm not sure if it's really how we should be doing it.
  • I am getting access to the record id through the parent property, and that's another one of those techniques we might better avoid. I could probably just pass that ID to the web resource, but it's a demo script; it works, and that's all I wanted for now.


<html><head>
<style>
body,html
{ 
   margin: 0px;
   padding: 0px;
}
</style>
    
    <script src="../ClientGlobalContext.js.aspx" type="text/javascript"></script>
<script>
  function exportToCsv(filename, rows) {
        var processRow = function (row) {
            var finalVal = '';
            for (var j = 0; j < row.length; j++) {
                var innerValue = row[j] === null ? '' : row[j].toString();
                if (row[j] instanceof Date) {
                    innerValue = row[j].toLocaleString();
                };
                var result = innerValue.replace(/"/g, '""');
                if (result.search(/("|,|\n)/g) >= 0)
                    result = '"' + result + '"';
                if (j > 0)
                    finalVal += ',';
                finalVal += result;
            }
            return finalVal + '\n';
        };

        var csvFile = '';
        for (var i = 0; i < rows.length; i++) {
            csvFile += processRow(rows[i]);
        }

        var blob = new Blob([csvFile], { type: 'text/csv;charset=utf-8;' });
        if (navigator.msSaveBlob) { // IE 10+
            navigator.msSaveBlob(blob, filename);
        } else {
            var link = document.createElement("a");
            if (link.download !== undefined) { // feature detection
                // Browsers that support HTML5 download attribute
                var url = URL.createObjectURL(blob);
                link.setAttribute("href", url);
                link.setAttribute("download", filename);
                link.style.visibility = 'hidden';
                document.body.appendChild(link);
                link.click();
                document.body.removeChild(link);
            }
        }
    }
   var jQueryApi = null;
   // Walk up the frame hierarchy until we find a window that exposes the
   // jQueryApi object created by the hosting form, and cache it
   function get_jQueryApi()
   {  
        var p = parent;
        while(p != null)
        {
           if(typeof p.jQueryApi != 'undefined') {
               jQueryApi = p.jQueryApi;
               return;
           }
           if(p == p.parent) return; // reached the topmost window
           p = p.parent;
        }
   }

   // Get the id of the record displayed on the hosting form, without the braces
   function get_ParentId()
   {  
      return parent.Xrm.Page.data.entity.getId().replace('{', '').replace('}', '');
   }

   function downloadRelationships()
   {
      get_jQueryApi();
      // the third argument of retrieveMultipleRecords is the max page size
      Xrm.WebApi.retrieveMultipleRecords("ita_ddlocation", "?$select=ita_name&$top=10&$filter=ita_ddcompanyid/ita_ddcompanyid eq " + get_ParentId(), 3).then( 
             
                function success(result) {
                  
                  var data = [];   
                  for (var i = 0; i < result.entities.length; i++) {
                     var row = [];
                     row.push(result.entities[i].ita_name);
                     row.push(result.entities[i].ita_ddlocationid);
                     data.push(row);
                     console.log(result.entities[i]);
                  }
                  exportToCsv("data.csv", data);           
                  // perform additional operations on retrieved records
               },
              function (error) {
                 console.log(error.message);
                 // handle error conditions
             }
      );

   }
</script>
<button onclick="downloadRelationships();" name="Export" text="Export">Export</button>


Hope you liked it, too. Happy 365-ing!