
Custom connector: where PowerAutomate makes peace with Logic Apps

Remember this screenshot?

Actually, other than Azure Functions and CDS custom actions, there is at least one other option in Power Platform which we can use to add custom code to our Power Automate Flows and/or to our Power Apps.

Those are custom connectors. We can also use custom connectors with Logic Apps, so this is where all those Azure technologies become equal in a way. Although, while Flows and Power Apps can only use REST APIs, Logic Apps can also use SOAP web services. That gives Logic Apps a little edge, but, well, how often do we use SOAP these days?

Either way, the problem with custom connectors is that creating them is not quite as simple as creating an Azure Function or a CDS custom action.

Here is what the lifecycle of a custom connector looks like:

image

Source: https://docs.microsoft.com/en-us/connectors/custom-connectors/

The last two steps on this diagram are optional. As for the first three, they can be quite challenging because there are various options for creating an API, securing it, and hosting it somewhere.

Still, what if I wanted to create a simple custom connector to implement the same regex matching that I used in the previous posts for Azure Functions and CDS Custom Actions?

I could create a Web API project in Visual Studio. There is a tutorial here:

https://docs.microsoft.com/en-us/aspnet/core/tutorials/first-web-api?view=aspnetcore-3.1&tabs=visual-studio

In the remaining part of this post, I'll show you how it worked out. And, if you want to get the source code for that regex Web API from github, here is a link:

https://github.com/ashlega/ItAintBoring.PowerPlatformWithCode

Essentially, it’s the same regex code I used for the Functions and/or for the CDS custom actions:

image
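In case the screenshot is hard to read, here is a minimal sketch of what such a controller might look like (the route, class, and property names are my assumptions for the sake of illustration – the actual code is in the github repo above):

using System.Text.RegularExpressions;
using Microsoft.AspNetCore.Mvc;

namespace ItAintBoring.RegExApi.Controllers
{
    // Hypothetical request model - the real one lives in the github repo
    public class RegExRequest
    {
        public string Value { get; set; }
        public string Pattern { get; set; }
    }

    [ApiController]
    [Route("api/[controller]")]
    public class RegExController : ControllerBase
    {
        [HttpPost]
        public ActionResult<string> Post(RegExRequest request)
        {
            // Collect all matches into a semicolon-separated string
            string result = "";
            foreach (Match m in Regex.Matches(request.Value, request.Pattern, RegexOptions.IgnoreCase))
            {
                if (result != "") result += ";";
                result += m.Value;
            }
            // Returned as a plain string here; the response shape in the actual github project may differ
            return result;
        }
    }
}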

I can try this in Postman (I'm hiding the actual link since, depending on where I leave it, it might not be protected with any kind of authentication; you can publish that web api from github in your own tenant to get your own link):

image

And the result comes back (although, compared to the other versions, it’s now in json format):

image

Now, let's turn this into a custom connector.

There is a tutorial here: https://docs.microsoft.com/en-us/connectors/custom-connectors/define-blank

But, either way, let's see how to do it for the regex API above.

In the Power Apps maker portal, go to the custom connectors area:

https://make.powerapps.com/

image

Creating a connector from scratch:

image

image

 

image

When importing from a sample, make sure to specify the full url. This feels strange, since I would assume that, with the base url specified before, there would be no need to specify the complete path to the api below, but it just has to be there. So, here we go (nothing goes into the headers, btw):

image

With the response, there is no need to provide urls etc – just provide a sample response:

image

Once the response has been imported, for some reason nothing actually changes on the screen – there is no indication that a response has been added, but it’s there:

image

You can click on that “default” above, and you’ll see it:

image

Actually, the connector is almost ready at this point and we just need to create it:

image

And then it’s ready for testing:

image

When creating a new connection above, you will probably find yourself taken away from the "custom connector" screens. So, once the connection has been created, go back to the "custom connectors" area, choose your connector, open it for "edit", and choose the newly created connection:

image

Then we can finally test it:

image

And we can use this new connector in the Flow:

image

Apparently, it works just fine:

image

But what if I wanted to add authentication to my API? Since it’s hosted in Azure as an app service, I can just go there and enable authentication:

image

I can, then, get everything set up through the express option:

image

Save the changes, and it’s done!

Sorry, just joking – not really done yet.

The connector needs to be updated now, since it does not yet know that authentication is required.

In order to update the connector, I need to configure the application first. The application will be there under "app registrations" in the Azure Portal – here is what it looks like in my case:

image

There is, also, a secret:

image

With all that in place, it’s time to update connector settings.

First, let’s make it https:

image

Here is what the connector security settings look like:

image

The Application (client) ID from the app registration page in the Azure Portal goes into the Client ID field. The secret key goes into the Client secret field. Login URL and Tenant ID are left just the way they are.

Resource URL is set to the same value as Client ID.

Then there is that last parameter which must be copied and added to the redirect urls for my app registration in Azure Portal:

image

Now it’s actually done. Once the connector has been updated and a new “authenticated” connection is created, I can retest the connector:

image

It works… so I just need to update my Flow above (it will require a new connection this time), and retest the flow.

It may seem as if this was quite a bit more involved than Azure Functions or CDS custom actions. But it's probably just a matter of perception, since, come to think of it, it's my first custom connector, and I had to figure out some of those things as I kept going.

More to follow on this topic, but enough for now. Have fun!

FetchXml powers turned out to be limited, and I’ve just discovered it the hard way

 

image

That’s just how many linked entities you can have in FetchXml.
I guess I have never needed more than this. That’s until today, of course:

image

Actually, this is not how I discovered it. I was writing an SSRS report, and the number of linked entities in my FetchXml query kept growing, so at some point the report just stopped working:

image

That error message made me try my Fetch in the XrmToolBox, which led to the error above, which, in turn, made me look at the documentation again… and there it is:

https://docs.microsoft.com/en-us/powerapps/developer/common-data-service/use-fetchxml-construct-query

image

It seems the limitation has been there forever, but it’s only been added to the docs recently:

image

So I'll probably have to make the report work some other way. Might have to start using subreports instead of bringing all the data through fetch…

Early transition to the UCI – possibly a false alarm yet

We all know that by October 2020 the classic web client will be retiring, and the UCI will take over everywhere the classic web client might still be reigning at the time of writing this post.

This can be a very sensitive topic, though, and it can be quite confusing, too. As mentioned in this post, it seems Microsoft is now scheduling the transition for early 2020, and, quite frankly, that may scare the hell out of anybody in the community.

So, I just wanted to clarify something. From what I understand, this early transition is not the same as getting rid of the classic solution designer or the settings area. There are a bunch of environments I work with which have already been transitioned:

image

This screenshot comes directly from the runone portal (https://runone.powerappsportals.com/), where you can review the environments and schedule/postpone the updates.

I can still do all my administrative tasks and solution configuration in the classic interface in that transitioned environment:

image

What I can't do is work with the actual applications in the classic interface in those environments anymore.

In other words, what this change will bring over is "UCI for the end users", but not yet "UCI for the admins". Mind you, it's not necessarily making this easy for the end users, but we have all been warned a while ago, and the clock is definitely ticking very loudly now. At least, I don't think we should be concerned about losing the ability to use the classic solution designer or to create/update classic workflows with this early transition in 2020 (which might be in preparation for the eventual "full" transition later in the year).

Working with HTML tables in Power Automate Flows

While playing with "HTML tables" earlier today, I suddenly realized that there seems to be a bit more depth to them than I expected.

Let’s say you have this Flow:

image

And imagine that you wanted to

  • Add headers with spaces
  • Change overall look and feel of the rendered table

Turns out those things are not that straightforward.


But, to start with, when I first tried using an HTML Table today, I found that I was having trouble adding dynamic values. Instead of the regular dynamic values screen with the search box and two tabs ("dynamic content" / "expression"):

image

I saw this:

image

Have you ever seen it like that? If you have not, and if you see it, don’t panic. It’s not a new feature!

Just try scaling your browser window down. Even if you are currently at 100%, scale down to 90%. Definitely scale down to 100% if you are at 130%. Once you do it, you'll probably see the usual dynamic content window:

image


Let’s get back to the HTML Table, though.

Using dynamic content there is as straightforward as anywhere else in the Flows, but how do we add colors and borders, modify text size, etc.?

For example, if I add an HTML font tag to the value:

image

That’s only going to mess up the output since those tags will be added there as “text” rather than as html:

image

So, there is an awesome post which explains the basic concept behind HTML Table formatting options:

https://www.sharepointsiren.com/2019/07/formatting-html-tables-in-flow/

Essentially, we need to get the output and post-process it. I can easily get the output by adding a Compose action:

image

You can take that source and use TryIt to see what it looks like:

https://www.w3schools.com/html/tryit.asp?filename=tryhtml_intro

image

What if, instead of messing with that HTML, we just styled that table using an external stylesheet? To make it look more like this:

image

Of course, if you wanted to play with CSS, you could probably make it look even better. How do we add that CSS to the output of the HTML Table action, though?

First, get that CSS file uploaded to some web server which will be available from wherever the table will eventually be viewed. Maybe to Azure somewhere (for instance: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-static-website)

In my case, I am using the "styledtable" class name, so I'll just need to add that class name to the table tag, and I'll also need to add a "link" tag to the output to link my css file to the table. Here is a compose action to do it:

image

And here is that replace function (you’ll definitely need to adjust it for your needs):

replace(body('Create_HTML_table'),'<table>','<link rel="stylesheet" href="https://itaintboring.com/downloads/tablestyle.css"><table class="styledtable">')

All that’s left is to test the result, so let’s add the output to an email:

image

And have fun with the results:

image

And one last note… normally, you are not allowed to add spaces to the header fields. But, of course, you can always use a Compose action to compose something with spaces, and, then, use that action’s output for the header fields:

image

There we go:

image

Taking a closer look at how Flows and Apps communicate with connectors

I used to think that connectors would be isolated from my local machine in the sense that, since they are in the cloud, my machine would be talking to the Flow/Canvas Apps/Flow Designer/etc, but not to the connector directly. Basically, like this:

image

And I was going to mention it in the context of security in one of the following posts. But it turned out there is an interesting scenario where connectors do behave differently depending on whether we are working with them in the “designer” or whether our flows are responding to various triggers.

Earlier today, I got the following error when trying to choose environment for the CDS connector in the Flow:

image

So I got on the call with Microsoft Support just to find out that everything was working. How come?

Well, I was using a laptop which was connected to a different network. You can probably see where this is going now.

Back on the machine where it was not working, in the network tab of Chrome dev tools, I could see that the following request was failing:

image

That’s the evidence that there is some communication with the connectors which may be happening from the “client side”. In other words, the communication diagram should look a little different:

image

In practical terms, that means one should always read the manuals rather than assuming too much. For this particular issue, there is a whole section in the documentation related to the IP address configuration:

https://docs.microsoft.com/en-us/power-automate/limits-and-config#ip-address-configuration

And the one which we ran into is mentioned there, too. It seems to be one of a few for which I would not be able to explain the purpose right away (would not even recognize them):

image

But, if you look at where the error happened on the screenshots above, you’ll see how having a connectivity issue between your client machine and that domain could hurt you.

Now, in my case there was a problem with DNS resolution. I fixed it temporarily by adding the required ip address to the hosts file:

52.242.36.40 canada-001.azure-apim.net

Which also allowed me to do an interesting test. What if, after fixing the connections, I saved the flow and removed that IP address from the hosts file?

The Flow just ran. Without any issues.

Even though, when I tried editing the flow, I could not load the environments again.

Which kind of makes sense, but it also gives a clue about what that azure-apim.net is for. Flows will be running on the cloud servers, so they won't have a problem connecting to azure-apim.net from there. However, when editing Flows in the designer, the designer will need to work with those connectors, too. It turns out there is a special server (or servers) hosting the "connectors runtime", and it needs to be accessible from our local machines to let the Flow designer communicate with the connectors. It's not CDS-specific, it's not connector-specific… For instance, just out of curiosity, I tried the Outlook connector and got an error on the same URL:

image

This is not all, though. If you open the network tab for a canvas application, you'll actually see that Canvas Apps are communicating with the apim servers even in the "play" mode, so, essentially, there is no way around this. We just need to make sure the apim servers are accessible from our network.

Power Automate Strikes Again

 

I will start this post with exactly the same picture as the previous one to keep us all alert. Remember, Logic Apps have Inline Scripts now – there is no time to relax till we find an appropriate answer to this challenge:

And, even though I feel much better now, keeping in mind that Azure Functions proved to work quite well with the Flow yesterday (and, subsequently, with the Canvas Apps), there is one minor issue I did not mention since I did not want to undermine what was done.

image

See, Azure Functions are not solution-aware. They are not Power Platform aware, for that matter, so you might find it complicated to distribute Azure Functions with the solutions, especially if you are an ISV.

Any options? Is it, actually, a big issue?

Hm… to start with, Logic Apps are not solution-aware either. So, technically, PowerAutomate is already winning from the get-go, since PowerAutomate Flows and Canvas Apps are solution-aware. But, still, it would be nice to have the option of putting everything into a solution, sending it to the client, and living happily ever after.

This is why this post is going to be about Custom Actions!

“Heh…” – would say a seasoned Dynamics warrior… – “ I knew you would mention it”.

And rightly so (and, btw, if you are one, please feel free to take a seat back there for a while till we get to the Flows down below… Then come forward to read through the rest of this post). But, for those coming more from the Canvas App / Power Automate world, let’s start from the beginning.

So, what is an action?

“Actions and functions represent re-usable operations you can perform using the Web API”

https://docs.microsoft.com/en-us/powerapps/developer/common-data-service/webapi/use-web-api-actions

This is all in the context of CDS. There is the Web API, there are actions, and we can reuse those actions through the Web API (yes, through the SDK, too).
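For example (just a hedged sketch – the action and parameter names below are hypothetical, use whatever you define for your own action), calling a custom action through the SDK boils down to executing an OrganizationRequest:

// "service" is an IOrganizationService connected to your CDS instance
var request = new OrganizationRequest("ita_RegExMatch")
{
    ["Value"] = "Emails: first@test.com, second@test.com",
    ["Pattern"] = @"[\w\.-]+@[\w\.-]+\.\w+"
};
var response = service.Execute(request);
// The output parameter name has to match the one defined on the action
string matches = (string)response.Results["Matches"];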

What is a CUSTOM action?

There are re-usable operations available out of the box, but we can also create our own actions in CDS. Not surprisingly, those are called custom actions.

What does it have to do with adding custom code to PowerAutomate?

First, we can write code with custom actions. More details below.

And, second, there is a relatively new Common Data Service (Current Environment) connector which we can use to easily call custom actions from Flows:

image

Put those two together and you have custom code in PowerAutomate. Let’s do just that.


Logic apps should really start feeling the heat now. This connector is only available in Flows and PowerApps!

image

https://docs.microsoft.com/en-us/connectors/commondataserviceforapps/

Not that I have anything against Logic Apps, but we ought to have something unique on the PowerPlatform side, right?


Anyway, it’s time to create a custom action for the same regex example I used in the previous post.

1. Let’s switch to the classic solution designer (custom actions don’t seem to be available in the new designer yet)

image

2. Let’s create a new action (as a process)

For this one, there is no need to bind it to an entity

image

3. This action will have two input parameters and one output parameter

image

4. No need for any workflow steps – we’ll have C# code instead

So, we can simply activate the action

image

5. Now on to the C# code

Basically, we need a plugin. That plugin will kick in whenever an action is called (well, once there is a plugin, we’ll need to register it so it all gets connected)

Plugin development can be a rather long story, which I tried to cover in my now-2-years-old course:

http://itaintboring.com/downloads/training/Plugin%20Development.pdf

Again, you will find complete code on github, but here is the essence of that plugin:

image
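In case the screenshot is hard to read, here is a rough sketch of what such a plugin could look like (the input/output parameter names are my assumptions – they have to match whatever you defined on the action; the complete version is on github):

using System;
using System.Text.RegularExpressions;
using Microsoft.Xrm.Sdk;

public class RegExActionPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

        // Read the two input parameters defined on the custom action
        string value = (string)context.InputParameters["Value"];
        string pattern = (string)context.InputParameters["Pattern"];

        // Same regex logic as in the Azure Function version
        string result = "";
        foreach (Match m in Regex.Matches(value, pattern, RegexOptions.IgnoreCase))
        {
            if (result != "") result += ";";
            result += m.Value;
        }

        // Write the result into the output parameter
        context.OutputParameters["Matches"] = result;
    }
}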

It’s almost a reincarnation of the Azure Function discussed in the previous post with some added plumbing to turn this code into a plugin.

6. We need to register the plugin

The process of registration involves using what is called a plugin registration tool:

image

You may need to download it first:

https://docs.microsoft.com/en-us/dynamics365/customerengagement/on-premises/developer/download-tools-nuget

7. We need to link that plugin to the custom action created before

This involves creating a step in the plugin registration tool. Notice the message name – it’s, actually, the name of the custom action:

image

This took a while – about half an hour, but we have it all ready and can go back to the Flow now.

(Apparently, I cheated here since I had all those tools installed already and I knew what I was doing. It may take somewhat longer if you are doing it for the first time, but you know where to find me if you have questions)

8. Creating the Flow (I am hoping seasoned Dynamics developers are back, btw)

The Flow is very similar to what it used to be for the Azure Function example. The only difference is that instead of calling an Azure Function through the HTTP connector, we will need to call an unbound action through a Common Data Service (Current Environment) connector:

image

9. Calling that Flow from the Canvas App

image

10. And here is the end result (which is no different from what we had before with Azure Functions)

powerapp_to_customaction

 

So, then, compared to the Azure Functions, what is the difference?

  • Custom Actions are CDS-based
  • Azure Functions do not need a CDS
  • Custom action calls will be counted against CDS request limits
  • Azure Function calls will be priced according to Azure Functions pricing
  • Custom actions can be added to the solutions (you will have to add the plugin, the action, and the step)
  • Azure Functions will have to be deployed somehow, but they don’t need CDS
  • Finally, since custom actions will be using plugins for the “code” part, it will be easy to connect to CDS from that plugin
  • Azure Functions, if they need to connect to CDS, will have to create the required connections, authenticate, etc.

 

In other words… if you have a CDS instance there, and if you need to simplify devops, custom actions might work better. Except that there is that API request limits part which might make you think twice.

This is not over yet, though. It may take a few days before another post on this topic, but I’m hoping to take a closer look at the custom connectors soon.

Adding real code to the low-code

 

Just look at this – it’s a screenshot from Logic Apps:

image

To say that I felt bad when I saw this is to say nothing! I was pretty much devastated.

Do you know that Logic Apps have an Inline Script component that a logic app designer can use to run Javascript?

And we, Power Platform folks, don't have it in Power Automate!

You are not sure what I’m talking about? Well, I warn you, you may lose your sleep once you open the link below. But here you go:

https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-add-run-inline-code

HOW COULD THIS BE ADDED TO THE LOW-CODE PLATFORM? Apparently, end of the world is nearing…

Or, possibly, I am just being jealous. Can we have that in Power Automate? Please?

Anyway, fooling aside, I figured this might be a good reason to explore what options we have in PowerPlatform when we need to add real code to such low-code solutions as Power Automate and/or Canvas Apps.

Since, of course, writing formulas for Canvas Apps cannot be considered real coding. It’s just writing (sometimes very complex) formulas.

Off the top of my head, it seems there are a few options:

  • Azure Functions
  • CDS Custom Actions
  • Custom Connectors(?)
  • Calling a logic app which can utilize the inline script mentioned above(?)

Let's try something out to compare those options. To be more specific, let's try implementing that regex above.

To start with, we could use the Match function in Canvas Apps to achieve most of that, but, for the sake of the experiment, let's imagine we still wanted to call out to some code instead, even from the Canvas Apps.

Anyway, Azure Functions were first on my list, so let's do it all in the same order.

1. Azure Functions

Assuming you have Visual Studio and an Azure subscription, setting up an Azure Function is easy:

  • You need to create an Azure Function project in the Visual Studio
  • And you need to deploy it to Azure

image


As for the price, it seems to be very affordable for what we are trying to use functions for. Most likely it'll be free, since you'll probably hit some limits on the Flow side long before you reach 1,000,000 function executions, but, on the other hand, it all depends on the number of Flows utilizing that function. Still, here are some numbers from the pricing calculator:

image

It took me about an hour (not that long, considering that the last time I wrote an Azure Function was a year or so ago), but a quick test in Postman shows that I can now start using that function to extract all emails from the given string (or, if I want, to find all substrings matching some other regex):

image

 

You will find the sources here:

https://github.com/ashlega/ItAintBoring.PowerPlatformWithCode

But, just quickly, here is the code:

        // Minimal POCO for the posted payload (assumed shape: { "value": "...", "pattern": "..." })
        public class RegExPostData
        {
            public string value { get; set; }
            public string pattern { get; set; }
        }

        [FunctionName("RegEx")]
        public static async Task<HttpResponseMessage> Run([HttpTrigger(AuthorizationLevel.Function, "post", Route = null)]HttpRequestMessage req, TraceWriter log)
        {
            // Deserialize the posted json into the value/pattern pair
            RegExPostData data = await req.Content.ReadAsAsync<RegExPostData>();

            string result = "";

            // Find all matches and return them as a semicolon-separated string
            MatchCollection foundMatches = Regex.Matches(data.value, data.pattern, RegexOptions.IgnoreCase);
            foreach (Match m in foundMatches)
            {
                if (result != "") result += ";";
                result += m.Value;
            }
            return req.CreateResponse(HttpStatusCode.OK, result);
        }

 

Btw, why am I returning a string, not json? This is because, later on, I'll need to pass the response back to a Canvas App, and, when doing it from a Flow, it seems I don't have a "json" option. All I have is this:

image

Hence, string it is.

Now, how do I call this function from a Flow/CanvasApp?

Technically, the simplest solution, at least for now, might be to figure out how to call it from a Flow, since we can call a Flow from the Canvas Apps. Which kind of solves the problem for both, so let’s do it that way first.

Here is the Flow:

image

 

When creating the Flow for a Canvas App, I had to use PowerApps trigger:

image

And I added parameters for the Flow using that magical “Ask in PowerApps” option:

image

So I just initialized variables (I might not have had to create variables, really).

Which I used in the HTTP action to call my Azure Function:

image

And the result of that call went back to the Canvas App through the respond to a PowerApp or Flow action:

image

And what about the Canvas App? It’s very straightforward there:

image

In the OnSelect of the button, I am calling my Flow, which is calling an Azure Function, which is using a regex to find matches, and, then, the result goes back to the Flow, then to the Canvas App, then it gets assigned to the FoundMatches variable… Which is displayed in the textbox:

powerapp_to_azurefunc

One immediate observation – calling an Azure Function this way is not the same as just calling code. There is a delay, of course, because of all the communication etc. Other than that, it was not that complicated at all to make an Azure Function work with a Flow, and, through that flow, with a Canvas App.

And, of course, if I wanted that Azure Function to connect to CDS and do something in CDS, it would have to be a bit more complicated Azure Function. But there might be a better option for that scenario, which is creating a Custom Action and using a CDS (current environment) connector to call that custom action.
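Just to illustrate the kind of extra plumbing I mean, here is a hedged sketch of how an Azure Function could connect to CDS through the Xrm Tooling connector (the connection string setting name and the use of CrmServiceClient are assumptions for the sake of the example):

using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Tooling.Connector;

public static class CdsSample
{
    public static void CreateSampleRecord()
    {
        // The connection string would live in the function app settings, e.g.
        // "AuthType=ClientSecret;Url=https://yourorg.crm.dynamics.com;ClientId=...;ClientSecret=..."
        var service = new CrmServiceClient(Environment.GetEnvironmentVariable("CdsConnectionString"));
        if (service.IsReady)
        {
            // Just an example operation - create an account record in CDS
            var account = new Entity("account") { ["name"] = "Created from an Azure Function" };
            service.Create(account);
        }
    }
}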

That’s for the next post, though: Power Automate Strikes Again

Do you want to become an MVP?

I was watching the video Mark Smith just posted and, as it often happens, got a mixed impression.

Of course I do agree that there is always this idea that becoming an MVP should bring in some benefits. When thinking of "becoming an MVP", you inevitably start to think of those benefits at some point. As in: "is it worth putting in all that effort to be awarded?"

In that sense, Mark has done a great job explaining the benefits.

However, I wanted to give you a little different perspective.

First of all, let's be honest, I am not a great speaker. My main contribution to the community has always been this blog and Linkedin. There were a few tools, there were occasional presentations, and there were forum answers at some point. But all those videos and conferences… I'm just not wired that way.

Somehow, I still got awarded twice so far. What did it really give me?

Consider the NDA access. I do appreciate it since, that way, I often hear about upcoming changes and can provide early feedback before the changes become public. However, I can rarely act on that information since I can’t reference it. In other words, if I knew of an upcoming licensing change (it’s only an example, please don’t start pulling your hair), I could only influence license purchase decisions indirectly. On the technical side, it could be more helpful. But, again, how do you explain technical or architecture decisions which are made based on the NDA information?

Do I appreciate NDA access, though? Absolutely. Even if, more often than not, I can't use it to justify my decisions, it gives me the feeling that I can influence the product group's direction. How about that? All of a sudden, I am not just a "client" utilizing the product – I am a bit of a stakeholder who can influence where the product goes.

What about the money? In my "independent consultant" world, I know a lot of people who are making more even though they are not MVPs. Maybe it's different for full-time employees, but I can't say much about it.

Speaking engagements. Personally, I am not looking for them that actively. On a practical note, though, I think those engagements are tied to the previous point, which was "money". More speaking engagements and more publicity mean better recognition and, in the end, more opportunities to land better jobs/contracts. On the other hand, that's travel, that's spending time away, etc.

How about free software and tools? I have MSDN and Azure credits. I have Camtasia. Etc. That does help. The tricky part there is… what if I don't get renewed next year? I will lose all that. But, then, to what extent can I rely on those benefits when preparing my sample solutions, tools, posts, presentations, etc.? The way I personally deal with this: I am trying to use these kinds of benefits, of course, but I am trying not to over-rely on them. For example, rather than getting an MVP instance of Dynamics 365, I'm getting one through the Microsoft Action Pack subscription. Am I using MSDN? Of course. If I lose it, I'll deal with it when the time comes.

So, in general, I think my overall idea of the MVP program has not changed much in the last year:

https://www.itaintboring.com/mvp/who-are-those-mvp-folks/

However, again on a practical note, what if, after doing all your research, you still wanted to be an MVP? My personal recipe is relatively simple:

  • Find your own motivation for making those community contributions. As for me… I always thought that I can learn more through sharing. No, I am not always sharing just because I want to share. I am sharing because, while doing so, I can fine-tune my own skills and knowledge. After all, how do you write about something if you don't understand it? The same goes for different tools – it's one thing to have something developed for myself, but it's another thing to have a tool that somebody else can use. In the same manner, how do you answer a forum question if you don't know the answer? You'll just have to figure out that answer first.
  • Once your motivation and efforts are aligned with the MVP program, and assuming you've been doing whatever it is you've been doing for some time, you will be awarded. Yes, you may have to get in touch with other MVPs just to become nominated, but, more likely than not, by the time you do it (and assuming you've been making quality contributions), you will already be on the radar, so the question of being nominated won't be a question at all.

Of course, this recipe does not guarantee the award, since there is no formula to calculate the value of your contributions ahead of time. Well, you may just have to start doing more of those, and then, a little more again. And you’ll get there.

CDS (current environment) connector is playing hide and seek?

Have you ever seen a connector playing hide and seek? Just look at the recording below:

  • Common Data Service (Current Environment) connector does not show up when I type "related records" on the first screen
  • But it does show up when I do it on the other screen

currentenvconnector

What’s the difference?

From what I could see so far, the only difference is that, in the first case, my Flow is created outside of a solution. In the second case, I'm creating a Flow within a solution.

But, that magic aside, if you have not seen that connector yet, it’s definitely worth looking at since we are getting additional functionality there:

  • FetchXML
  • Relate/Unrelate records
  • Actions
  • There is a unified on update/create/delete trigger

 

And, also, this connector can be deployed through your solutions without having to update the connections.

Actually, it’s a bit more complicated. If you deploy the Flow with such a connector through a managed solution, it will start working in the new environment.

BUT. If you choose to look at that flow in the new environment, you’ll notice that the connections are broken, so you won’t be able to change anything in the Flow until you fix the connections.

Here, notice how the Flow ran a few times:

image

But the connections, if you decide to look at them, are broken:

image

The trick with this connector is not to touch those connections. Just leave them be, deploy that flow through a solution, and, probably, don't forget to delete the solution when deploying an updated version of the Flow (for more details on this, have a look at the previous post).

It’s the balloons time (when the help panes are feeling empty)

 

I just learned how to create balloons!

helppanes

At first, it actually did not look that interesting at all when I clicked the question mark button:

image

 

image

Yep, it felt a little empty there. So, I started to wonder, what can I do to make it more interesting?

It turned out there is quite a bit we can do:

https://docs.microsoft.com/en-us/powerapps/maker/common-data-service/create-custom-help-pages

We can add sections, images, bullet lists, videos, some other things… and, of course, those balloons.

The thing about the balloons, though, is that they are linked to the page elements, so, if the elements change (or if you navigate to a different page), the balloons might stop flying. Well, that's just a note – other than that, the balloons are still awesome.

So, what is it we may want to keep in mind about the help panes?

We can add them to the solutions. Although, only while in the classic(?) mode

image

 

We can work with the help XML using the definition below. Although, I guess that involves extracting solution files, updating them, then packing them back into a solution file (would be a good use for the solution packager)

https://docs.microsoft.com/en-us/powerapps/maker/common-data-service/create-custom-help-pages#custom-help-xml-definition

The help pane will stay open as the user keeps navigating in the app

This may bring the help pane a little bit out of context, so the users would have to remember to either close it or click the "home" button at the top left corner to change the context for the help pane.

Help panes are language-specific

I just switched to French, and the help pane is feeling empty again

image

I used a Dynamics 365 environment everywhere above, but it actually works in CDS environments, too

image

 

Well, it seems to be a pretty cool feature. Of course help authoring may take time, and keeping it up to date may take time, too. But it seems to be a nice easy-to-use feature which can help even if we choose to only use it sporadically where it’s really needed.