Monthly Archives: January 2020

N:N Lookup on the new record form? Let’s do it!

It was great to see how the N:N lookup PCF control sparked some interest, but there are still a few things that could (and probably should) be added.

For example, what if I wanted to make it work when creating a new record? Normally, a subgrid won’t even show up on the new record form. But, in the updated version of the N:N lookup, it’s actually possible now:

ntonmultiselectoncreate

So, where is the catch?

The problem there is that there is no way for the PCF control to associate anything to the record being created, since, of course, that record does not exist yet. But, I thought, a “post operation” plugin would certainly be able to do it:

 

image

If you wanted to try it, here is what you need:

NToNMultiSelect control has been updated, too

You can use the same approach with any entity, just keep in mind a few things.

NToNMultiSelect is supposed to be bound to a single line text control. I should probably change this to “multiline”, but, for now, that’s what it is. Since this control is passing JSON data through that field, the field should be long enough (2000 characters). Yes, there is still room for improvement.
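
By the way, the exact payload format is not documented here, so the sketch below is just an illustration (the real schema used by NToNMultiSelect may differ). Conceptually, though, the control serializes the selected records into that bound field, which is why the field length matters:

	// illustration only - the actual JSON schema used by the control may differ
	var selectedRecords = [
		{ id: "<guid of the first selected record>", name: "Record A" },
		{ id: "<guid of the second selected record>", name: "Record B" }
	];
	// this serialized string is what ends up in the single line text field,
	// so 2000 characters can run out quickly if there are many selections
	var fieldValue = JSON.stringify(selectedRecords);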

Also, you will need to register a plugin step on each entity which is using this control:

image

It should be registered in the PostOperation, and it should be synchronous.
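
In other words, the step registration would look something along these lines (I am using “Create” below since the whole point is to handle records which are just being created – you may want to add “Update” to cover the existing records, too):

Message: Create (and, possibly, Update)
Primary Entity: whichever entity is using the control
Event Pipeline Stage: PostOperation
Execution Mode: Synchronous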

The plugin will go over all the attributes, and, if any of them includes data in the required format, it will parse the data, and it will create required record associations.

That’s it for today – have fun with the Power! (just testing a new slogan here 🙂)

Is it a multiselect optionset? Nope… it’s an N:N lookup

If you ever wanted to have your own multiselect optionset which would be utilizing an N:N relationship behind the scenes, here you go:

ntonmultiselect

It works and behaves similarly to the out-of-the-box multiselect optionset, but it’s not an option set. It’s a custom PCF control that’s relying on the N:N relationship to display those dropdown values and to store the selections.

Turned out it was not even that difficult to build this – all that was needed was to combine Select2 with PCF.

The sources (and the solution file) are on github: https://github.com/ashlega/ITAintBoring.PCFControls

This is the first version, so it might not be “final”, but, so far, here is how this control is supposed to be configured:

  • You can use it for any single line text control
  • There are a few properties to set:

image

Linked Entity Name: “another” side of the N:N

Linked Entity Name Attribute: usually, it would be the “name” attribute of the linked entity

Linked Entity ID Attribute: and this is the “Id” attribute

Relationship Name: this is the name of the N:N relationship (from the N:N properties page)

Relationship Entity Name: this is the name of the N:N relationship entity (from the N:N properties page)

Some of those properties could probably be retrieved through the metadata requests, but, for now, you’ll just need to set them manually when configuring the control.
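
For example, if the “other” side of the N:N were the out-of-the-box contact entity, the configuration might look like this (the relationship names below are hypothetical – take the real ones from your N:N properties page):

Linked Entity Name: contact
Linked Entity Name Attribute: fullname
Linked Entity ID Attribute: contactid
Relationship Name: ita_contact_myentity
Relationship Entity Name: ita_contact_myentity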

PS. There is more to it now (Jan 31): https://www.itaintboring.com/dynamics-crm/nn-lookup-on-the-new-record-form-lets-do-it/

2020 Release Wave 1 – random picks

Looking at the 2020 Release Wave 1 features, it’s kind of hard to figure out which ones will be more useful. Somehow, all of those I’ve read through so far seem to have the potential to strike a chord with those working with Power Platform / Dynamics 365, so it’s going to be a very interesting wave.

Here are just a few examples:

Enabling printable pages in canvas apps

“Makers are able to configure a printable page in their canvas apps, taking the content on the screen and turning it into a printable format (PDF)”

I was talking about it with a client just the other week – they wanted to know if there was a way to print a Canvas App form. It’s still not exactly around the corner, since public preview of this feature is coming in July 2020, but for a lot of enterprise projects this is, actually, not too far away.

General availability for large files and images is coming in April 2020

Are you still not comfortable with SharePoint integration for some reason and need a way to link large files directly to the records in CDS? There you go:

image

Forms displayed in modal dialogs

Do you want that command bar button to open a dialog before you deactivate a record? Or, possibly, before you close a case?

You will be able to open regular forms in the modal popup dialogs now. This kind of functionality is something we’ve been asking about for years:

“Users do not have to navigate away from a form to create or edit a related record. This greatly improves productivity by reducing clicks and eliminating the need to do unnecessary navigation back and forth across forms.”

image
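
By the way, on the client side this is expected to surface through the Xrm.Navigation.navigateTo client API, where “target: 2” opens a page in a dialog. Here is a minimal sketch (the entity name and sizing below are just an illustration, and the exact mechanics of the new forms feature may differ):

function openRecordInDialog(entityName, recordId)
{
	var pageInput = {
		pageType: "entityrecord",
		entityName: entityName,
		entityId: recordId
	};
	var navigationOptions = {
		target: 2, // 2 = dialog, 1 = inline page
		width: { value: 80, unit: "%" },
		position: 1 // 1 = centered dialog
	};
	Xrm.Navigation.navigateTo(pageInput, navigationOptions).then(function () {
		// the dialog has been closed - refresh the parent form data here if needed
	});
}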

Actually…

There is going to be a configurable case resolution page in Wave 1

“Choose between the non-customizable modal dialog experience (default setting) and the customizable form experience”

Will it be based on the modal dialog forms mentioned above? We’ll see soon, I guess.

“Save” button is back

It’s not hiding down there anymore – it’s back at the top (although, I think, it’s down there as well).

Btw, technically, the description given in the release plan is not 100% correct: “Before this release, if the auto save option was turned on, both options were hidden and not available in the command bar”

See, in the releases that might now be long forgotten, the “save” button was always visible at the top. I guess the good things are coming back sometimes 🙂

License enforcement for Team Member licenses

Team Member licenses have always been a problem because the somewhat vague language around them could not stop people from trying to utilize those licenses. After all, the price could be really attractive.

Now that Power Apps has its own $10 license, and, so, the Team Member license only makes sense for Dynamics 365, license enforcement will be coming in.

image

Why do I think it’s a good thing? Well, that’s because it brings certainty and leaves no room for interpretation. The clients won’t be at risk of violating the license terms once those terms are, actually, enforced.

There is more…

Flow steps in business process flows, secrets management in the flows, etc., etc.

Have a look for yourself:

2020 Release Wave 1 for Power Platform

2020 Release Wave 1 for Dynamics 365

Lookup filtering with connection roles

Here is what I wanted to set up today:

There is a custom SkillSet entity that has an “Advisor” field. That field is a lookup to the out-of-the-box contact entity. However, unlike with a regular lookup, I want that “Advisor” field to only display contacts which are connected to the current skillset through the “Expert” connection role.

In other words, imagine I have the skillset record below, and it has a couple of connected contacts (both in the “Expert” role):

image

I want only those two to show up in the lookup selector when I am choosing a contact for the “Advisor” field:

image

Even though there are, of course, other contacts in the system.


Actually, before I continue, let’s talk about connections and connection roles quickly. There is not a lot I can say in addition to what has already been written in the docs:

https://docs.microsoft.com/en-us/powerapps/maker/common-data-service/configure-connection-roles

Although, if you have not worked with the connections before, there is something to keep in mind:

Connection roles can connect records of different types, but there is neither “source” nor “target” in the role definition

It’s not as if there were a source entity, a target entity, and a role. It’s just that there is a set of entities, and you can connect any entity from that set to any other entity in that set using your connection role:

image

Which may lead to some interesting effects – for example, I can have a SkillSet connected to a Contact as if that SkillSet were an expert, which does not really make sense:

image

But, of course, I can get a contact connected to a skillset in that role, and that makes much more sense:

image

 


That’s all great, but how do I filter the lookup now that I have an “Expert” role, and there are two contacts connected to the Power Platform skillset through that role?

That’s where we need to use the addCustomView method.

Why not use addCustomFilter?

The first method (addCustomView) accepts a complete fetchXml query as one of the parameters, which means we can do pretty much anything there. For example, we can link other entities to define more advanced conditions.

The second method (addCustomFilter) accepts a filter to be applied to the existing view. We cannot use this method to define a filter on the linked entities.
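
Just to illustrate the difference, here is a minimal addCustomFilter sketch (the “active contacts only” filter below is an illustration, not a solution for the connections scenario – note how it can only reference attributes of the contact entity itself):

function advisorLookupOnLoad(executionContext)
{
	var context = executionContext.getFormContext();
	// addCustomFilter has to be called from within a pre-search handler
	context.getControl("ita__advisor").addPreSearch(function () {
		var filter = "<filter type='and'>" +
			"<condition attribute='statecode' operator='eq' value='0' />" +
			"</filter>";
		context.getControl("ita__advisor").addCustomFilter(filter, "contact");
	});
}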

In the case of connections, what we need is a view that starts with the contacts and only displays those which are connected to the selected SkillSet record in the “Expert” role, like this:

image

So… You will find a link to the github repo below, but here is the script:

function formOnLoad(executionContext)
{

	var context = executionContext.getFormContext();
	
	// any unique GUID will do here - it identifies the custom view
	var viewId = "bc80640e-45b7-4c51-b745-7f3b648e62a1";
	var fetchXml = "<fetch version='1.0' output-format='xml-platform' mapping='logical' distinct='true'>"+
	  "<entity name='contact'>"+
		"<attribute name='fullname' />"+
		"<attribute name='telephone1' />"+
		"<attribute name='contactid' />"+
		"<order attribute='fullname' descending='false' />"+
		"<link-entity name='connection' from='record2id' to='contactid' link-type='inner' alias='ce'>"+
		  "<link-entity name='connectionrole' from='connectionroleid' to='record2roleid' link-type='inner' alias='cf'>"+
			"<filter type='and'>"+
			  "<condition attribute='name' operator='eq' value='Expert' />"+
			"</filter>"+
		  "</link-entity>"+
		  "<link-entity name='ita__skillset' from='ita__skillsetid' to='record1id' link-type='inner' alias='cg'>"+
			"<filter type='and'>"+
			  "<condition attribute='ita__name' operator='eq' value='" + context.getAttribute("ita__name").getValue() + "' />"+
			"</filter>"+
		  "</link-entity>"+
		"</link-entity>"+
	  "</entity>"+
	"</fetch>";
	
	var layoutXml = "<grid name='resultset' object='2' jump='fullname' select='1' preview='0' icon='1'>"+
	  "<row name='result' id='contactid'>"+
		"<cell name='fullname' width='300' />"+
	  "</row>"+
	"</grid>";
	
	context.getControl("ita__advisor").addCustomView(viewId, "contact", "Experts", fetchXml, layoutXml, true);
}

 

What’s happening in the script is:

  • It defines fetchXml (which I downloaded from the Advanced Find)
  • It dynamically populates the skillset name in the fetchXml condition
  • Then it defines layout xml for the view. I used the View Layout Replicator plugin in XrmToolBox to get the layout quickly (see the screenshot below)
  • Finally, the script calls “addCustomView” on the lookup control

image

 

And, of course, that script has been added to the “onLoad” of the form:

image

Now, I used connections above since that’s something that came up on the current project, but, of course, the same technique with custom views can be applied in other scenarios where you need to create a custom lookup view.

Either way, if you wanted to try it quickly, you will find the unmanaged solution file in the git repo below:

https://github.com/ashlega/ItAintBoring.ConnectionRoleFilteredLookup

Have fun!

Power Platform Admin vs Dynamics 365 Admin

If you’ve been using the Dynamics 365 Admin role to delegate Dynamics/Power Platform admin permissions to certain users, you might want to have a look at the Power Platform Admin role, too, since it may work better in some cases.

The main difference between those two roles is that you may need to add Dynamics 365 Admin users to the environment security group in order to let them access the environment, whereas you don’t need to do it for the Power Platform Admins:

https://docs.microsoft.com/en-us/power-platform/admin/use-service-admin-role-manage-tenant

image

Here is a quick illustration of how it works:

1. New user, no admin roles

No environments are showing up in the admin portal:

image

2. Same user, Power Platform Admin role

Six environments are showing up:

image

3. Same user, Dynamics 365 Admin role

Only five environments are showing up now since the 6th one has a security group assigned to it, and my user account is not included in that group:

image

Still, both roles are available, and it may make sense to use Dynamics 365 Admin in situations where you want to limit permissions a bit more. Although, the whole reason for this post is that we found it a little confusing that such users must still be added to the environment security group, and, for us, it seems switching to Power Platform Admin might make this a little more straightforward.

Reactivating a classic workflow that’s part of a managed solution

Managed solutions are recommended for production, and we’ve been using them lately without much trouble, but, occasionally, something does come up.

One of the solutions had a workflow which required reference data. So it should not have been included in that solution to start with, but, since it was, it could not be activated.

We’ve got the reference data deployed, and I was trying to activate the workflow… when I ran into the error below:

image

As it often happens, the error is not too helpful:

“Action could not be taken for few records because of status reason transition restrictions. If you contact support, please provide the technical details”.

Apparently it’s talking about the status reason transitions… that kind of threw me off at first, since I thought I just couldn’t reactivate “managed” workflows at all for some reason. That would be a bummer for sure.

Well, turned out there is still a way. As it’s been for a while, if you can’t do something from the managed solution, try doing it from the default solution. Worked like a charm in this case, too, and I got my workflow activated.

But, of course, I should not have had this problem to start with had I put all those workflows in the right solutions and done my deployment in the right order. Still… If there is a third-party solution in the system, it might be helpful to know that what’s been deactivated can still be reactivated. As long as it’s done from the default solution.

Word Templates and default printer

Have you ever used a Word Template in Power Apps?

Choose Word Template and select entity

If you have not, have a look at this documentation page. For the model-driven apps, it’s one of the easiest ways to quickly create standardized Word documents for your entities.

Although, Word templates do come with some limitations – I won’t go into the details here since it’s not what this post is about. I use Word templates occasionally, and they work great where I don’t hit those limitations.

This was one of the projects where Word templates seemed to fit great. We had a few different documents to print, there were no deep relationships to display, we could live with no conditional logic, etc. And, then, just about the time we were supposed to go live, one of the business folks was looking at it and posed a very interesting question:

“So, do I have to remember to switch the printer every time I use this?”

See, for some of the records, there would be more than one template, and they would have to be printed on different printers. One of the printers would be a regular printer, but the other one would be a plastic card printer. And, yes, if somebody did send a 10-page regular document to the card printer, that would be a waste of plastic cards. The opposite of that would be sending a card template to the regular printer, but that’s much less problematic.

Seems simple, right? Let’s just set the default printer and be done with it, or, at least, so I thought.

Unfortunately for us, Microsoft Word (2016 in our case) turned out to be more optimized than expected 🙂

If you have 2 printers, and if you set one of those as the default printer, you would probably expect the default printer to be selected by default?

image

The way it works, though, is:

  1. Imagine you’ve opened a document in Word
  2. Then you printed that document to a non-default printer
  3. Then you opened another document in a different Word window
  4. And you are trying to print that second document

The printer you’ll see selected by default is the same printer that you used for the first document:

image

Isn’t that awesome? You don’t need to choose, that’s the one you used before… except that we’d just waste a bunch of plastic cards in our scenario.

The problem seems to be related to the fact that there is only one winword process, no matter how many Word documents you have open on the screen:

image

And, it seems, it’s that process that’s actually storing “current” printer selections for the user.

So, how can we work around this performance optimization in Microsoft Word?

We have to close all Word windows, then the process is unloaded from memory, and the next time we open a document in Word and try sending it to the printer, Word will be using the default printer again:

image

I wish there were a setting somewhere…

Well, there are articles suggesting the use of macros in this scenario to choose the printer, but, since it’s a Word template, and since there will be different users even on the same “terminal” machine, I am not sure how well this will work, or if it will work at all. Might still need to try.

Power Platform Dataflows vs … Taking a cruise to see Microsoft cloud ETL/ELT capabilities

Sometimes I think that Microsoft Cloud is not quite a cloud – it’s, actually, more like an ocean (which is, probably, similar to how things are with other “clouds” to be fair).

As an on-premise consultant, I did not use to appreciate the depth of Microsoft cloud at all. As a Power Platform consultant, I started to realize some of the extra capabilities offered by the Power Platform, such as:

  • Canvas Applications
  • Power Automate Flows
  • Different licensing options (can be good and bad)
  • Integration with Azure AD

 

Yet I was suffering quite often since, you know, there is “no way I can access the database”.

And, then, I tried the Dataflows recently. Which took me on a little different exploration path and made me realize that, as much as I’m enjoying swimming in the familiar lake, it seems there is so much more water out there. There is probably more than I can hope to cover, but I certainly would not mind going on a cruise and seeing some of it. So, this post is just that – a little cruise into the cloud ETL/ELT capabilities:

image

And, by the way, normally, you don’t really do deep diving on a cruise. You are out there to relax and see places. Here is the map – there will be a few stops, and, of course, you are welcome to join (it’s free!):

image

Stop #1: On-Premise ETL tools for Dynamics/Power Platform

If you have not worked with Dynamics on-premise, and I am assuming it’s about time for the pure-bred cloud consultants to start showing up, on-premise ETL tools might be a little unfamiliar. However, those are, actually, well-charted waters. On-premise ETL tools have been around for a long time, and, right off the top of my head, I can mention at least a few which I’ve touched in the past:

  • SSIS
  • Scribe (now TIBCO – thank you, Shidin Haridas, for mentioning they were acquired)
  • Informatica

 

They all used to work with Dynamics CRM/Dynamics 365 just fine. Some of them turned into SAAS tools (Scribe online, for example), and some of them took a different route by merging into the new cloud tools (SSIS). Either way, in order to use those tools we had to deploy them on premise, we had to maintain them, we had to provide required infrastructure, etc. Although, on the positive side, the licensing was never about “pay per use” – those tools were, usually, licensed per the number of connections and/or agents.

We are still just near the shore, though.

Stop #2: Power Platform ETL capabilities

This is where we are going a little beyond the familiar waters – we can still use those on-premise ETL tools, but things are changing. Continuing the analogy, the cruise ship is now somewhere at sea.

Even if you’ve been working with the Power Platform for a while now, you might not be aware of the ETL capabilities embedded into the Power Platform. As of now, there are, actually, at least 3 options which are right there:

  • Power Platform dataflows
  • Export to data lake
  • Power Automate flows

And, of course, we can often still use on-premise tools. After all, we are not that far from the shore. Though we are far enough for a bunch of things to have changed. For example, this is where an additional Power Platform licensing component kicks in since Power Apps licenses come with a certain number of allowed API calls.

Still, why would I call out those 3 options above? Technically, they are offering everything you need to create an ETL pipeline:

  • A schedule/a trigger/manual start
  • A selection of data sources
  • A selection of data destinations

 

Well, data lake export is special in that sense, since it’s hardwired for the CDS to Azure Data Lake export, but, when in the cloud, that’s an important route, it seems.

How do they compare to each other, though? And, also, how do they compare to the on-premise ETL tools (let’s consider SSIS, for example)?

image

The interesting part about Data Lake Export is that it does not seem to have any obvious advantages over any of the other tools EXCEPT that setting up CDS to Data Lake export looks extremely simple when done through “data lake export”.

Stop #3: Azure Data Factory

Getting back to the analogy of Azure being the ocean, it should not surprise you that, once in the ocean, we can probably still find the water somewhat familiar, and, depending on where we are, we might see familiar species. Still, the waters are certainly getting deeper, and there can be some interesting ocean-only life forms.

Hey, there is one just off the port side… Have you seen Azure Data Factory? That’s a real beast:

image

This one is strong enough to survive in the open waters – it does not care about Power Platform that much. It probably thinks Power Platform is not worth all the attention we are paying it, since here is what Azure Data Factory can offer:

image

  • It has data flows to start with
  • It can copy data
  • It has connectors
  • It has functions
  • It has loops
  • It is scalable
  • Pipeline designer looks somewhat similar to SSIS
  • It can actually run SSIS packages
  • It allows deployment of self-hosted(on-premise) integration runtime to work with on-premise data
  • It offers pipeline triggers
  • It has the ability to create reusable data flows
  • It has native support for CI/CD (so, there is dev-test-prod)

 

And I think it has much more, but, well, it’s a little hard to see everything there is to it while on a cruise. Still, this screenshot might give you an idea of what it looks like:

image

In terms of data transformations, it seems there is a lot more one can do with the Data Factory than we can possibly do with the Dataflows/Data Lake Export/Power Automate Flows.

Although, of course, Data Factory does not really care about the Power Platform (I was trying to show it Power Platform solutions, and it just ignored them altogether. Poor thing is not aware of the solutions)

Finally, going back and relaxing in the sun…

image

It’s nice to be on a cruise, but it’s also great to be going home. And, as we are returning to the familiar Power Platform waters, let’s try putting all the above in perspective. The way I see it now, and I might be more than a little wrong, since, really, I did not have an opportunity to do a deep dive on this cruise, here is how it looks:

  • SSIS will be becoming less and less relevant
  • Azure Data Factory will take over (probably has already done it)
  • Power Platform’s approach is almost funny in that sense. And, yet, it’s extremely useful. Following the familiar low code/no code philosophy, Power Platform has introduced its own tools. Which often look like simplified (and smaller) versions of their Azure counterparts, but which are meant to solve common Power Platform problems, and which are sometimes optimized for the Power Platform scenarios (environments, solutions, CDS data source, etc). The funny part there is that we, Power Platform consultants, are treated a little bit like kids who can’t be trusted with the real things. But, well, that approach does have some advantages:)

 

Power Platform dataflows

Have you tried Power Platform dataflows yet?

image

I would not be too surprised if you have not – I had not tried them until this weekend either. Might not have completely figured them out yet, but here is a quick rundown so far.

Basically, a data flow is an ETL process that takes data from the source, uses Power Query to transform it, and places this data in one of two possible destinations:

image

Among those sources, there are some really generic ones – you can use Web API, OData, JSON, XML… They can be loaded from OneDrive, they can be loaded from a URL, etc:

image

For the Power Automate/Power Apps folks reading this – Data Flows are not using all the familiar connectors you may be used to when creating Power Automate flows. As I understand it, Data Flows cannot be extended by throwing in yet another data source the same way you would do it in Power Automate. Although, since there are those generic “Web API/OData” sources, the extensibility is still there.

However, Data Flows did not start in the Power Platform – they were first introduced in Power BI. There is a great post that explains why they were introduced there:

https://powerbi.microsoft.com/fr-fr/blog/introducing-power-bi-data-prep-wtih-dataflows/

“Previously, ETL logic could only be included within datasets in Power BI … Power BI dataflows store data in Azure Data Lake Storage Gen2”

In other words, the problem Data Flows meant to solve in the Power BI world was about doing all that data transformation work outside of the Power BI dataset to make it much more reusable.

Power Platform dataflows seem to be doing exactly the same, although they can also store data in the Common Data Service. Actually, by default they will target Common Data Service. If you choose “Analytical entities only”, you’ll get data stored in Azure Data Lake Storage Gen2:

image

But what if you wanted to move data from CDS to Azure Data Lake Storage Gen2? Potentially (and I have not tried), you can probably choose “Analytical entities only” on the screenshot above, then connect to CDS using the Web API, and move that data to the data lake.

There is another option in the Power Platform which is called Export to Data Lake:

image

There is some initial setup, but, once it’s all done, you can enable CDS entities for export to data lake:

image

Important: don’t forget to enable Change Tracking on your CDS entity if you want it to show up on the list above.

So, with all the above in mind, here are two other facts / observations (in no particular order):

  • When setting up a data flow, you need to configure the refresh frequency. For the data lake “target”, you can refresh the target dataset up to 48 times per day. It seems there is no such limitation for CDS.
  • “Export to data lake” works somewhat differently from a regular data flow. It does create files for the records, but it also creates snapshots. The snapshots are not updated at once – they are updated with a certain frequency (about 1 hour?)

 

Notice how, in the Storage Explorer, I have snapshots dated Jan 11:

image

However, the contacts file for 2018 has already been updated on Jan 12:

image

Have a look at the following post for a bit more details on this:

https://powerapps.microsoft.com/en-us/blog/exporting-cds-data-to-azure-data-lake-preview/

Compare those screenshots above to a regular Data Flow which has been configured with a 1-minute refresh frequency (and, therefore, which has stopped running because of the 48-runs-per-day limitation):

image

As you can see, there is a snapshot every minute, at least for as long as the data flow kept running.

Compose action, dynamic content, and data conversions

Earlier today, a colleague of mine, who tends to spend his days developing Power Automate flows lately, showed me something that seemed confusing at first. Now, having dug into it a bit more, I think it makes sense, but let’s see what you think.

Here is a Flow where I have an HTTP trigger, a Compose action, an Initialize Variable action, and a “send an email” action:

image

When trying to add dynamic content to the email body, I see the Compose action outputs, but I don’t see the variable. I also see “name” and “value” from the HTTP request JSON payload.

What’s interesting about all this is that:

  • Presumably, email “body” is of “string” type
  • Compose action is of “any” type
  • “Name” and “Value” are of “string” type, too

 

As for the email “body”, I am not really sure how to verify the type there, but it’s a reasonable assumption.

I was not able to find that statement about the “Compose” action in the Power Automate documentation, but here is what the Logic Apps documentation has to say:

https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-workflow-actions-triggers#compose-action

image

As for the Http request, here is the schema:

image
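
That screenshot is not reproduced here, but, judging by the “name” and “value” tokens showing up in the dynamic content, the request body schema would be something like this:

{
	"type": "object",
	"properties": {
		"name": { "type": "string" },
		"value": { "type": "string" }
	}
}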

So, what if I changed the type of my variable to make it a “string”? I would not be able to use “body” from the Dynamic content to initialize it:

image

BUT. I would be able to use that variable for the email body:

image

Or I could just use “Compose”, since, the way I see it, it can take “any” type for input, and it produces “any” type for output. Which makes it compatible with any other type, and which is different from variables, since they are actually strongly-typed.

PS. Of course, I might also use the triggerBody() function to achieve the same result without using Compose, but what would I write about then? 🙂
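
(For reference, and assuming the request schema above, that expression would be something along the lines of triggerBody()?['name'], typed directly into the email body.)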