Monthly Archives: November 2019

Working with HTML tables in Power Automate Flows

While playing with “HTML tables” earlier today, I suddenly realized that there is a bit more depth to them than I expected.

Let’s say you have this Flow:

image

And imagine that you wanted to

  • Add headers with spaces
  • Change overall look and feel of the rendered table

Turns out those things are not that straightforward.


But, to start with, when I first tried using an HTML Table today, I found that I was having trouble adding dynamic values. Instead of the regular dynamic content screen with the search box and two tabs (“Dynamic content” / “Expression”):

image

I saw this:

image

Have you ever seen it like that? If you have not, and if you see it, don’t panic. It’s not a new feature!

Just try scaling your browser window down. Even if you are currently at 100%, scale down to 90%. And if you are at 130%, definitely scale down to 100%. Once you do, you’ll probably see the usual dynamic content window:

image


Let’s get back to the HTML Table, though.

Using dynamic content there is as straightforward as anywhere else in Flows, but how do we add colors or borders, modify the text size, etc.?

For example, if I add an HTML font tag to the value:

image

That’s only going to mess up the output, since those tags will be added there as “text” rather than as HTML:

image

So, there is an awesome post which explains the basic concept behind HTML Table formatting options:

https://www.sharepointsiren.com/2019/07/formatting-html-tables-in-flow/

Essentially, we need to get the output and post-process it. I can easily get the output by adding a Compose action:

image
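For reference, the Create HTML table action emits a single line of bare HTML; in my case it looked roughly like this (the column names and values below are made up):

```html
<table><thead><tr><th>Full Name</th><th>Email</th></tr></thead><tbody><tr><td>John Doe</td><td>jdoe@example.com</td></tr></tbody></table>
```

Notice how there are no class names, styles, or any other attributes there – which is exactly why we have to post-process it.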

You can take that source and use TryIt to see how it looks:

https://www.w3schools.com/html/tryit.asp?filename=tryhtml_intro

image

What if, instead of messing with that HTML, we just styled that table using an external stylesheet? To make it look more like this:

image
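As a rough sketch (the class name matches the “styledtable” class used further below, but the colors and sizes here are just my picks), such a stylesheet could be as simple as:

```css
/* styledtable: a minimal example stylesheet for the rendered table */
.styledtable {
    border-collapse: collapse;
    font-family: Segoe UI, sans-serif;
}
.styledtable th, .styledtable td {
    border: 1px solid #999999;
    padding: 4px 8px;
}
.styledtable th {
    background-color: #2f5496;
    color: #ffffff;
}
```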

Of course, if you wanted to play with the CSS, you could probably make it look even better. How do we add that CSS to the output of the HTML Table action, though?

First, get that CSS file uploaded to some web server which will be reachable from wherever the table will eventually be viewed. Maybe somewhere in Azure (for instance: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-static-website)

In my case, I am using “styledtable” class name, so I’ll just need to add that class name to the table tag, and I’ll also need to add “link” tag to the output to link my css file to the table. Here is a compose action to do it:

image

And here is that replace function (you’ll definitely need to adjust it for your needs):

replace(body('Create_HTML_table'),'<table>','<link rel="stylesheet" href="https://itaintboring.com/downloads/tablestyle.css"><table class="styledtable">')

All that’s left is to test the result, so let’s add the output to an email:

image

And have fun with the results:

image

And one last note… normally, you are not allowed to add spaces to the header fields. But, of course, you can always use a Compose action to compose something with spaces, and then use that action’s output for the header fields:

image
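In text form, the Compose input is just the literal header text with the space in it (say, “First Name”), and the header field of the HTML Table action then references that output through an expression like:

```
outputs('Compose')
```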

There we go:

image

Taking a closer look at how Flows and Apps communicate with connectors

I used to think that connectors would be isolated from my local machine in the sense that, since they are in the cloud, my machine would be talking to the Flow/Canvas Apps/Flow designer/etc., but not to the connectors directly. Basically, like this:

image

And I was going to mention it in the context of security in one of the following posts. But it turns out there is an interesting scenario where connectors do behave differently depending on whether we are working with them in the “designer” or whether our flows are responding to various triggers.

Earlier today, I got the following error when trying to choose environment for the CDS connector in the Flow:

image

So I got on a call with Microsoft Support, just to find out that everything was working. How come?

Well, I was using a laptop which was connected to a different network. You can probably see where this is going now.

Back on the machine where it was not working, in the network tab of Chrome dev tools, I saw that the following request was failing:

image

That’s evidence that there is some communication with the connectors which may be happening from the “client side”. In other words, the communication diagram should look a little different:

image

In practical terms, that means one should always read the manuals rather than assuming too much. For this particular issue, there is a whole section in the documentation related to the IP address configuration:

https://docs.microsoft.com/en-us/power-automate/limits-and-config#ip-address-configuration

And the one we ran into is mentioned there, too. It is one of a few whose purpose I would not have been able to explain right away (I would not even have recognized them):

image

But, if you look at where the error happened on the screenshots above, you’ll see how having a connectivity issue between your client machine and that domain could hurt you.

Now, in my case there was a problem with DNS resolution. I fixed it temporarily by adding the required IP address to the hosts file:

52.242.36.40 canada-001.azure-apim.net

Which also allowed me to do an interesting test. What if, after fixing the connections, I saved the flow and removed that IP address from the hosts file?

The Flow just ran. Without any issues.

Even though, when I tried editing the flow, I could not load the environments again.

Which kind of makes sense, but also gives a clue about what azure-apim.net is for. Flows run on the cloud servers, so they won’t have a problem connecting to azure-apim.net from there. However, when editing Flows in the designer, the designer needs to work with those connectors, too. It turns out there is a special server (or servers) hosting the “connectors runtime”, and it needs to be accessible from our local machines so that the Flow designer can communicate with the connectors. It’s not CDS-specific, and it’s not connector-specific… For instance, just out of curiosity, I tried the Outlook connector and got an error on the same URL:

image

This is not all, though. If you open the network tab for a canvas application, you’ll actually see that Canvas Apps communicate with the apim servers even in “play” mode, so, essentially, there is no way around this. We just need to make sure the apim servers are accessible from our network.

Power Automate Strikes Again

 

I will start this post with exactly the same picture as the previous one to keep us all alert. Remember, Logic Apps have Inline Scripts now – there is no time to relax till we find an appropriate answer to this challenge:

And, even though I feel much better now, keeping in mind that Azure Functions proved to work quite well with the Flow yesterday (and, subsequently, with the Canvas Apps), there is one minor issue I did not mention, since I did not want to undermine what was done.

image

See, Azure Functions are not solution-aware. They are not Power Platform-aware, for that matter, so you might find it complicated to distribute Azure Functions with your solutions, especially if you are an ISV.

Any options? Is it, actually, a big issue?

Hm… to start with, Logic Apps are not solution-aware either. So, technically, Power Automate is already winning from the get-go, since Power Automate Flows and Canvas Apps are solution-aware. But, still, it would be nice to have the option of putting everything into a solution, sending it to the client, and living happily ever after.

This is why this post is going to be about Custom Actions!

“Heh…” – a seasoned Dynamics warrior would say – “I knew you would mention it”.

And rightly so (and, btw, if you are one, please feel free to take a seat back there for a while till we get to the Flows down below… Then come forward to read through the rest of this post). But, for those coming more from the Canvas App / Power Automate world, let’s start from the beginning.

So, what is an action?

“Actions and functions represent re-usable operations you can perform using the Web API”

https://docs.microsoft.com/en-us/powerapps/developer/common-data-service/webapi/use-web-api-actions

This is all in the context of CDS. There is the Web API, there are actions, and we can reuse those actions through the Web API (yes, and through the SDK, too).

What is a CUSTOM action?

There are re-usable operations available out of the box, but we can also create our own actions in CDS. Not surprisingly, these are called custom actions.

What does it have to do with adding custom code to PowerAutomate?

First, we can write code with custom actions. More details below.

And, second, there is a relatively new Common Data Service (Current Environment) connector which we can use to easily call custom actions from Flows:

image

Put those two together and you have custom code in PowerAutomate. Let’s do just that.


Logic Apps should really start feeling the heat now. This connector is only available in Flows and PowerApps!

image

https://docs.microsoft.com/en-us/connectors/commondataserviceforapps/

Not that I have anything against Logic Apps, but we ought to have something unique on the PowerPlatform side, right?


Anyway, it’s time to create a custom action for the same regex example I used in the previous post.

1. Let’s switch to the classic solution designer (custom actions don’t seem to be available in the new designer yet)

image

2. Let’s create a new action (as a process)

For this one, there is no need to bind it to an entity

image

3. This action will have two input parameters and one output parameter

image

4. No need for any workflow steps – we’ll have C# code instead

So, we can simply activate the action

image

5. Now on to the C# code

Basically, we need a plugin. That plugin will kick in whenever an action is called (well, once there is a plugin, we’ll need to register it so it all gets connected)

Plugin development can be a rather long story, which I tried to cover in my (now two years old) course:

http://itaintboring.com/downloads/training/Plugin%20Development.pdf

Again, you will find the complete code on GitHub, but here is the essence of that plugin:

image

It’s almost a reincarnation of the Azure Function discussed in the previous post with some added plumbing to turn this code into a plugin.

6. We need to register the plugin

The process of registration involves using what is called a plugin registration tool:

image

You may need to download it first:

https://docs.microsoft.com/en-us/dynamics365/customerengagement/on-premises/developer/download-tools-nuget

7. We need to link that plugin to the custom action created before

This involves creating a step in the plugin registration tool. Notice the message name – it’s, actually, the name of the custom action:

image

This took a while – about half an hour, but we have it all ready and can go back to the Flow now.

(Apparently, I cheated here since I had all those tools installed already and I knew what I was doing. It may take somewhat longer if you are doing it for the first time, but you know where to find me if you have questions)

8. Creating the Flow (I am hoping seasoned Dynamics developers are back, btw)

The Flow is very similar to what it used to be for the Azure Function example. The only difference is that instead of calling an Azure Function through the HTTP connector, we will need to call an unbound action through a Common Data Service (Current Environment) connector:

image

9. Calling that Flow from the Canvas App

image

10. And here is the end result (which is no different from what we had before with Azure Functions)

powerapp_to_customaction

 

So, then, compared to the Azure Functions, what is the difference?

  • Custom Actions are CDS-based
  • Azure Functions do not need a CDS
  • Custom action calls will be counted against CDS request limits
  • Azure Function calls will be priced according to Azure Functions pricing
  • Custom actions can be added to the solutions (you will have to add the plugin, the action, and the step)
  • Azure Functions will have to be deployed somehow, but they don’t need CDS
  • Finally, since custom actions use plugins for the “code” part, it is easy to connect to CDS from that plugin
  • Azure Functions, if they need to connect to CDS, will have to create the required connections, authenticate, etc.

 

In other words… if you have a CDS instance there, and if you need to simplify devops, custom actions might work better. Except that there is that API request limits part, which might make you think twice.

This is not over yet, though. It may take a few days before another post on this topic, but I’m hoping to take a closer look at the custom connectors soon.

Adding real code to the low-code

 

Just look at this – it’s a screenshot from Logic Apps:

image

To say that I felt bad when I saw this is to say nothing! I was pretty much devastated.

Do you know that Logic Apps have an Inline Script component that a logic app designer can use to run Javascript?

And we, Power Platform folks, don’t have it in Power Automate!

You are not sure what I’m talking about? Well, I warn you, you may lose your sleep once you open the link below. But here you go:

https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-add-run-inline-code

HOW COULD THIS BE ADDED TO THE LOW-CODE PLATFORM? Apparently, the end of the world is nearing…

Or, possibly, I am just being jealous. Can we have that in Power Automate? Please?

Anyway, joking aside, I figured this might be a good reason to explore what options we have in Power Platform when we need to add real code to such low-code solutions as Power Automate and/or Canvas Apps.

Since, of course, writing formulas for Canvas Apps cannot be considered real coding. It’s just writing (sometimes very complex) formulas.

Off the top of my head, it seems there are a few options:

  • Azure Functions
  • CDS Custom Actions
  • Custom Connectors(?)
  • Calling a logic app which can utilize the inline script mentioned above(?)

Let’s try something out to compare those options. To be more specific, let’s try implementing that regex above.

To start with, we could use the Match function in Canvas Apps to achieve most of that but, for the sake of the experiment, let’s imagine we still wanted to call out to some code instead, even from the Canvas Apps.
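Just for comparison, a no-code version in a Canvas App could be sketched with the built-in Match.Email pattern (the control name here is hypothetical):

```
Concat(
    MatchAll(TextInput1.Text, Match.Email),
    FullMatch & ";"
)
```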

Anyway, Azure Functions were first on my list, so let’s do it all in the same order.

1. Azure Functions

Assuming you have Visual Studio and an Azure subscription, setting up an Azure Function is easy:

  • You need to create an Azure Function project in Visual Studio
  • And you need to deploy it to Azure

image


As for the price, it seems to be very affordable for what we are trying to use functions for. Most likely it’ll be free, since you’ll probably hit some limits on the Flow side long before you reach 1,000,000 function executions; but, on the other hand, it all depends on the number of Flows utilizing that function. Still, here are some numbers from the pricing calculator:

image

 


 

It took me about an hour (not that long, considering that the last time I wrote an Azure Function was a year or so ago), but a quick test in Postman shows that I can now start using that function to extract all emails from a given string (or, if I want, to find all substrings matching some other regex):

image
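For reference, the request body I posted looked something like this (the property names come from the RegExPostData object used in the function, while the values are just an example):

```json
{
  "value": "Contact us at sales@example.com or support@example.com",
  "pattern": "[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\\.[A-Za-z]{2,}"
}
```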

 

You will find the sources here:

https://github.com/ashlega/ItAintBoring.PowerPlatformWithCode

But, just quickly, here is the code:

        // Simple contract for the POST body
        public class RegExPostData
        {
            public string value { get; set; }
            public string pattern { get; set; }
        }

        [FunctionName("RegEx")]
        public static async Task<HttpResponseMessage> Run([HttpTrigger(AuthorizationLevel.Function, "post", Route = null)]HttpRequestMessage req, TraceWriter log)
        {
            // Deserialize the POST body into the value/pattern pair
            RegExPostData data = await req.Content.ReadAsAsync<RegExPostData>();

            string result = "";

            // Collect every match into a semicolon-separated string
            MatchCollection foundMatches = Regex.Matches(data.value, data.pattern, RegexOptions.IgnoreCase);
            foreach (Match m in foundMatches)
            {
                if (result != "") result += ";";
                result += m.Value;
            }
            return req.CreateResponse(HttpStatusCode.OK, result);
        }

 

Btw, why am I returning a string, not JSON? This is because, later on, I’ll need to pass the response back to a Canvas App and, when doing that from a Flow, it seems I don’t have a “json” option. All I have is this:

image

Hence, string it is.

Now, how do I call this function from a Flow/CanvasApp?

Technically, the simplest solution, at least for now, might be to figure out how to call it from a Flow, since we can call a Flow from the Canvas Apps. Which kind of solves the problem for both, so let’s do it that way first.

Here is the Flow:

image

 

When creating the Flow for a Canvas App, I had to use PowerApps trigger:

image

And I added parameters for the Flow using that magical “Ask in PowerApps” option:

image

So I just initialized variables (I might not have needed to create variables, really).

Which I used in the HTTP action to call my Azure Function:

image

And the result of that call went back to the Canvas App through the respond to a PowerApp or Flow action:

image

And what about the Canvas App? It’s very straightforward there:

image
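On the Canvas App side, the button formula boiled down to something along these lines (the Flow name, control names, and the “result” property are hypothetical):

```
Set(
    FoundMatches,
    RegExFlow.Run(TextInput1.Text, TextInput2.Text).result
)
```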

In the OnSelect of the button, I am calling my Flow, which is calling an Azure Function, which is using a regex to find matches; then the result goes back to the Flow, then to the Canvas App, then it gets assigned to the FoundMatches variable… which is displayed in the textbox:

powerapp_to_azurefunc

One immediate observation – calling an Azure Function this way is not the same as just calling code. There is a delay, of course, because of all the communication involved. Other than that, it was not that complicated at all to make an Azure Function work with a Flow and, through that Flow, with a Canvas App.

And, of course, if I wanted that Azure Function to connect to CDS and do something in CDS, it would have to be a somewhat more complicated Azure Function. But there might be a better option for that scenario, which is creating a Custom Action and using the CDS (current environment) connector to call that custom action.

That’s for the next post, though: Power Automate Strikes Again

Do you want to become an MVP?

I was watching the video Mark Smith just posted and, as it often happens, got a mixed impression.

Of course I do agree that there is always this idea that becoming an MVP should bring in some benefits. When thinking of “becoming an MVP”, you inevitably start to think of those benefits at some point. As in, “is it worth putting in all that effort to be awarded?”

In that sense, Mark has done a great job explaining the benefits.

However, I wanted to give you a little different perspective.

First of all, let’s be honest, I am not a great speaker. My main contribution to the community has always been this blog and LinkedIn. There were a few tools, there were occasional presentations, and there were forum answers at some point. But all those videos and conferences… I’m just not wired that way.

Somehow, I still got awarded twice so far. What did it really give me?

Consider the NDA access. I do appreciate it since, that way, I often hear about upcoming changes and can provide early feedback before the changes become public. However, I can rarely act on that information since I can’t reference it. In other words, if I knew of an upcoming licensing change (it’s only an example, please don’t start pulling your hair), I could only influence license purchase decisions indirectly. On the technical side, it could be more helpful. But, again, how do you explain technical or architecture decisions which are made based on the NDA information?

Do I appreciate NDA access, though? Absolutely. Even if, more often than not, I can’t use it to justify my decisions, it gives me the feeling that I can influence the product group’s direction. How about that? All of a sudden, I am not just a “client” utilizing the product – I am a bit of a stakeholder who can influence where the product goes.

What about the money? In my “independent consultant” world, I know a lot of people who are making more even though they are not MVP-s. Maybe it’s different for the full-time employees, but I can’t say much about it.

Speaking engagements. Personally, I am not looking for them that actively. On a practical note, though, I think those engagements are tied to the previous point, which was “money”. More speaking engagement and more publicity means better recognition, and, in the end, more opportunities to land better jobs/contracts. On the other hand, that’s travel, that’s spending time away, etc.

How about free software and tools? I have MSDN and Azure credits. I have Camtasia. Etc. That does help. The tricky part there is… what if I don’t get renewed next year? I will lose all that. But, then, to what extent can I rely on those benefits when preparing my sample solutions, tools, posts, presentations, etc.? The way I personally deal with this: I am trying to use these kinds of benefits, of course, but I am trying not to over-rely on them. For example, rather than getting an MVP instance of Dynamics 365, I’m getting one through the Microsoft Action Pack subscription. Am I using MSDN? Of course. If I lose it, I’ll deal with it when the time comes.

So, in general, I think my overall idea of the MVP program has not changed much in the last year:

https://www.itaintboring.com/mvp/who-are-those-mvp-folks/

However, again on a practical note, what if, after doing all your research, you still wanted to be an MVP? My personal recipe is relatively simple:

  • Find your own motivation for making those community contributions. As for me… I always thought that I could learn more through sharing. No, I am not always sharing just because I want to share. I am sharing because, while doing so, I can fine-tune my own skills and knowledge. After all, how do you write about something if you don’t understand it? The same goes for the tools – it’s one thing to have something developed for myself, but it’s another thing to have a tool that somebody else can use. In the same manner, how do you answer a forum question if you don’t know the answer? You’ll just have to figure out that answer first.
  • Once your motivation and efforts are aligned with the MVP program, and assuming you’ve been doing whatever it is you’ve been doing for some time, you will be awarded. Yes, you may have to get in touch with other MVP-s just to become nominated, but, more likely than not, by the time you do it (and assuming you’ve been making quality contributions), you will already be on the radar, so the question of being nominated won’t be a question at all.

Of course, this recipe does not guarantee the award, since there is no formula to calculate the value of your contributions ahead of time. Well, you may just have to start doing more of those, and then, a little more again. And you’ll get there.

CDS (current environment) connector is playing hide and seek?

Have you ever seen a connector playing hide and seek? Just look at the recording below:

  • Common Data Service (Current Environment) connector does not show up when I type “related records” on the first screen
  • But it does show up when I do it on the other screen

currentenvconnector

What’s the difference?

From what I could see so far, the only difference is that, in the first case, my Flow is created outside of the solution. In the second case, I’m creating a Flow within a solution.

But, that magic aside, if you have not seen that connector yet, it’s definitely worth looking at since we are getting additional functionality there:

  • FetchXML
  • Relate/Unrelate records
  • Actions
  • There is a unified on update/create/delete trigger

 

And, also, this connector can be deployed through your solutions without having to update the connections.

Actually, it’s a bit more complicated than that. If you deploy a Flow with such a connector through a managed solution, it will start working in the new environment.

BUT. If you choose to look at that flow in the new environment, you’ll notice that the connections are broken, so you won’t be able to change anything in the Flow until you fix the connections.

Here, notice how the Flow ran a few times:

image

But the connections, if you decide to look at them, are broken:

image

The trick with this connector is not to touch those connections. Just leave them be, deploy the Flow through a solution, and, probably, don’t forget to delete the solution when deploying an updated version of the Flow (for more details on this, have a look at the previous post).

It’s the balloons time (when the help panes are feeling empty)

 

I just learned how to create balloons!

helppanes

At first, it actually did not look that interesting at all when I clicked the question mark button:

image

 

image

Yep, it felt a little empty there. So, I started to wonder, what can I do to make it more interesting?

Turned out there is quite a bit we can do:

https://docs.microsoft.com/en-us/powerapps/maker/common-data-service/create-custom-help-pages

We can add sections, images, bullet lists, videos, some other things… and, of course, those balloons.

The thing about the balloons, though, is that they are linked to the page elements, so, if the elements change (or if you navigate to a different page), the balloons might stop flying. Well, that’s just a note – other than that the balloons still are awesome.

So, what is it we may want to keep in mind about the help panes?

We can add them to the solutions, although only while in the classic(?) mode

image

 

We can work with the help XML using the definition below, although I guess that involves extracting the solution files, updating them, and then packing them back into a solution file (that would be a good use for the solution packager):

https://docs.microsoft.com/en-us/powerapps/maker/common-data-service/create-custom-help-pages#custom-help-xml-definition

The help pane will stay open as the user keeps navigating in the app

This may bring the help pane a little bit out of context, so the users would have to remember to either close it or to click the “home” button at the top left corner to change the context for the help pane.

Help panes are language-specific

I just switched to French, and the help pane is feeling empty again

image

I used a Dynamics 365 environment everywhere above, but it actually works in CDS environments, too:

image

 

Well, it seems to be a pretty cool feature. Of course help authoring may take time, and keeping it up to date may take time, too. But it seems to be a nice easy-to-use feature which can help even if we choose to only use it sporadically where it’s really needed.

A tricky Flow

I got a tricky Power Automate Flow the other day – it was boldly refusing to meet my expectations of what it was supposed to do. In retrospect, as much as I would like to say that it was all happening because Power Automate was in a bad mood, there seem to be a couple of things we should keep in mind when creating Flows, and, somewhat specifically, when using the Common Data Service (current environment) connector:

image

That connector supports FetchXML queries in the List Records action, which makes it very convenient in situations where you need to query data based on some conditions.

Here is what may happen, though.

Let’s imagine some simple scenario for the Flow:

  • The Flow will start on create of the lead record
  • When a lead is created, the Flow would use “List records” action to select a contact with the specific last name
  • Finally, the flow would send an email to that contact

 

And there will be two environments, so the idea is that we’ll use a managed solution to move this flow from development to production:

image

image

Let’s see if it works? I’ve created a lead, and here is my email notification:

image

But wait, wasn’t it supposed to greet me by name, not just say “Hello”?

The problem is that, even though I can use all those attributes in the Flow designer, they have to be added to the FetchXML in order to be queried through the List Records action. Since I did not have firstname included in the fetch, it came up empty.
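To illustrate, a query which selects the attributes used later in the Flow might look along these lines (the attribute names and the filter value here are just an example):

```xml
<fetch top="1">
  <entity name="contact">
    <attribute name="contactid" />
    <attribute name="firstname" />
    <attribute name="emailaddress1" />
    <filter>
      <condition attribute="lastname" operator="eq" value="Smith" />
    </filter>
  </entity>
</fetch>
```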

The fix is simple:

image

And I have my email with the proper name now:

image

Now let’s bring this flow through a managed solution to another environment.

  • Export as managed
  • Import into the prod environment

 

Before I continue, let’s look at the solution layers for that flow in production:

image

Everything is perfect, but now we need to fix the connections for the flow:

  • image
  • Once the connections have been fixed, apparently we need to save the Flow.
  • What happens to the solution layers when we click “save”, though?

 

image

That is, actually, unfortunate. Let’s say I need to update the Flow now.

In the development environment, I can add an extra attribute to the Fetch:

image

That solution is, then, exported with a higher version, and I’m about to bring it over to production:

image

I should see that attribute added in production now, right?

image

You can see it’s not there.

I would guess this problem is related to solution layering – when updating the connections in production, I had to save the Flow there, and that created a change in the unmanaged layer. Importing the updated managed solution made changes to the managed layer but, since it was an existing solution/flow, those changes went in just under the unmanaged layer, so they did not show up on the “surface”.

If I go to the solution layers for my Flow in production and remove the active customizations:

image

All connections in the Flow will be broken again, but that additional attribute will finally show up:

image

This is when I can fix the connections, and, finally, get the Flow up and running as expected.

Of course another option might be to remove managed solution completely and to re-import updated version. Since I normally have Flows/Workflows in a separate solution, that would probably work just fine, even if I had to request a downtime window.

 

Bulk-loading inactive records to CDS

 

When implementing ItAintBoring.Deployment powershell modules, I did not initially add support for the “status” and “status reason” fields. Usually, we don’t need to migrate inactive reference data, but there are always exceptions, and I just hit one the other day. Reality check.

There is an updated version of the powershell script now, and there is an updated version of the corresponding nuget package.

But there is a caveat.

In CDS, we cannot create inactive records. We have to create a record as “active” first and, then, we can deactivate it.

Just to illustrate what happens when you try, here is a screenshot of the Flow where I am trying to create a record using one of the inactive status reasons:

image

The error goes like this:

7 is not a valid status code for state code LeadState.Open on lead with Id d794b380-0501-ea11-a811-000d3af46cc5.

In other words, CDS is trying to use inactive status reason with the active status, and, of course, those are incompatible.

The workaround here would be to create the record first using one of the active status reasons, and, then, to change the status/status reason.
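In Web API terms (outside of the PowerShell scripts), that two-step workaround would be a create followed by an update, roughly like this sketch, where the URL, record id placeholder, and status values are all hypothetical:

```
POST [orgurl]/api/data/v9.1/leads
{ "subject": "Imported lead" }

PATCH [orgurl]/api/data/v9.1/leads(<leadid>)
{ "statecode": 1, "statuscode": 7 }
```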

If we get back to the bulk data load through powershell scripts above, then it would look like this:

  • Export data without status/status reason into one file
  • Export data with status/status reasons into another file
  • Import the first file
  • Import the second file

 

In other words, in the export script I would use these two queries (notice how there is no status/status reason in the first one, while the second one is querying all attributes):

image
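In text form, those two queries might look something like this sketch (the entity and attribute choices are only an illustration):

```xml
<!-- File 1: create the records, no status fields -->
<fetch>
  <entity name="lead">
    <attribute name="subject" />
    <attribute name="lastname" />
  </entity>
</fetch>

<!-- File 2: bring in everything, including statecode/statuscode -->
<fetch>
  <entity name="lead">
    <all-attributes />
  </entity>
</fetch>
```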

Once I’ve run the export, here is how the exported data looks – you can see the difference:

image

And, then, I just need to import those files in the same order.

Here is what I had before I ran the import:

image

Here is what I have after:

image

It takes a little bit of planning to bulk-load reference data this way but, in the end, it’s just an extra run of the script and an extra fetch xml for me, and quite a bit of time saved when automating the deployment.

UI Flow in Power Automate (former Microsoft Flow)

 

If you have not heard about UI Flows, give them a try! As in right now… that’s just some cool stuff from Microsoft which is now in preview.

Log in to your PowerApps portal (https://make.powerapps.com), select Flows, and choose UI Flows:

image

The coolest part about it is that, right from the start, you can probably appreciate where it’s all going:

image

There are no connectors, there is no API, there is nothing BUT recording and re-running of the user actions.

You want to automatically open an application, fill in some fields, click the save button, etc.? There you go. You want to open a web site, populate some fields, click “next”, etc.? That’s another scenario. Basically, we should be able to automate various usage scenarios – I am not sure if this also means we’ll be able to use this for automated testing, and I am also not sure to what extent this will work with various web applications where controls are created/loaded/updated on the fly… But, if it all works out, this is going to be really interesting.

And I am wondering about the licensing since, technically, it seems there will be no API calls or connectors involved, so there might not be a lot of load on the Microsoft servers when running such flows. Does that mean “not that expensive” licensing? We’ll see, I guess.

Anyway, let’s just try something.

Let say I wanted to create a desktop UI flow:

image

Apparently, I need to download a browser extension. Presumably, that’s to actually perform actions on my computer (heh… how about security… anyway, that’s for later):

image

image

Here is a funny hiccup – the installer asked me to close all Edge/Chrome windows. I lost the Flow and had to re-open and re-create it.

Continuing from there, I was still getting the same error.

After some back and forth, I tried installing the new Edge (Chromium) – still the same… Eventually, I tried updating that same Flow using a different portal:

https://flow.microsoft.com

It finally worked that way, and it also started working through make.powerapps.com after that.

And I have just recorded a UI Flow which is going to put some text into a notepad window!

image

Here, have a look (it takes a few seconds to start the test):

desktopuiflow

It might not seem like much for now, but keep in mind this was just a simple test. At the moment, I am not even sure of the practical applications and/or limitations yet, but that’s for later.