Monthly Archives: January 2021

Sometimes, we still do certain things using on-premises technologies. I just spent half of my day solving a strange problem with SSIS.

It’s been a while since I wrote about SSIS – most of what we do these days is happening in the cloud.

But, sometimes, we still need to resort to the good old time-proven technologies… such as SSIS.

What’s the de facto standard toolset of any Dynamics/Power Platform consultant who needs to use SSIS? Well, that’s, of course, SSIS itself, and the Kingsway connector in particular.

If you ever had to do the following:

  • Read rows from a file
  • Adjust the Dynamics CRM Source query for each line to query data from one environment
  • So you could push certain changes to another environment


You probably know that the Dynamics CRM Source does not accept inputs, so you can’t just use a File Source and connect the File Source’s output to the Dynamics CRM Source’s input:


Actually, this does not have anything to do with Kingsway, since other data sources (OLE DB, for instance) would behave in exactly the same way.

I don’t know if I’ll ever get a chance to write about it again given that I only need to dig into SSIS every so often these days, so, here is a workaround just for the record.

Instead of doing everything in the same data flow task, you can do this:


In the first task there, read data from the file and push all file lines to an object variable using a script component:


Here is what the script looks like:


So, basically, just push everything into an ArrayList, and pass it back to the package through a variable.
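For the record, the script component boils down to something like the following (a minimal sketch, not the exact code from my package: the input column name `Line` and the object variable `User::FileLines` are assumptions, so use whatever names exist in your own package):

```csharp
using System.Collections;

// Script Component with User::FileLines (type Object)
// listed under ReadWriteVariables.
public class ScriptMain : UserComponent
{
    private ArrayList lines = new ArrayList();

    // Called once per row coming from the file source.
    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        lines.Add(Row.Line); // "Line" is the (assumed) input column name
    }

    // Called once, after all rows have been processed - the only place
    // where a script component can assign its ReadWriteVariables.
    public override void PostExecute()
    {
        base.PostExecute();
        Variables.FileLines = lines;
    }
}
```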

From there, set up a For Each Loop Container to work off your variable:


For every iteration, you’ll need to push each individual line to another variable:


From there, you can now work with each row separately in the loop. This should also be applicable to other data sources – it could even be Dynamics CRM Source originally, though I’m not sure how to pass complex objects through the enumeration in this case.

Anyway, since we now have an individual file row in the variable, we can use a Script Task to parse that line and populate additional variables; from there, we can use those other variables in the other tasks within each iteration to query Dataverse and/or to do whatever else we need there.
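The Script Task itself can be as simple as this (again, just a sketch: `User::CurrentLine` is the variable populated by the loop, and the other variable names, as well as the comma-separated layout, are assumptions):

```csharp
public void Main()
{
    // User::CurrentLine holds the current row from the Foreach Loop;
    // the other variable names here are examples only - map your own.
    string line = (string)Dts.Variables["User::CurrentLine"].Value;

    // Assuming a simple comma-separated layout; add validation as needed.
    string[] parts = line.Split(',');
    Dts.Variables["User::SourceRecordId"].Value = parts[0].Trim();
    Dts.Variables["User::TargetValue"].Value = parts[1].Trim();

    Dts.TaskResult = (int)ScriptResults.Success;
}
```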


And here is how the parsing is done in that Script Task:

And the end result is that I can now use those variables to tweak FetchXml as required for all Dynamics CRM Sources in those other tasks within the Foreach Loop Container.
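For instance, the FetchXml on a Dynamics CRM Source could be built through an expression that splices one of those variables into a filter, something along these lines (the entity, attribute, and variable names below are placeholders only; the `@[User::…]` part is where the task expression substitutes the variable value):

```xml
<fetch>
  <entity name="account">
    <attribute name="name" />
    <filter>
      <condition attribute="accountid" operator="eq"
                 value="@[User::SourceRecordId]" />
    </filter>
  </entity>
</fetch>
```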

That seems to be a lot of work for a simple task… and I can’t help but notice that I could probably do this more efficiently in Power Automate. But, well, we don’t always have the choice of technology, so, if you have to do this with SSIS, this might help.

TDS endpoint – getting closer to GA?

Just a few days ago, when I was writing a post about Power BI Paginated Reports, the “Dataverse” connection type was still called “Common Data Service (Preview)” there.

As of today, in the updated version of the Power BI Report Builder, it’s called “Dataverse”:


There is no reference to the preview anymore, but, apparently, it’s still a SqlClient connection.

That said, it seems Dataverse SQL Query is still in preview:

Yet there is no mention of the Dataverse datasource in the docs, on the list of datasources for Power BI Paginated Reports:

So is it official? Is it not official? Probably not quite, yet. Not until we see it documented in the docs. But, that said, it’s clear some work is happening behind the scenes, and TDS endpoint-based connectors are starting to make it into the Power Platform (and, intentionally or not, they are not even explicitly defined as preview features anymore).

Power BI Paginated Reports and data security

In the previous post, I looked at some of the features of Power BI paginated reports. What was still unclear is how security works.

It turned out it works pretty much as one would expect: users having no access to the data won’t see that data in the report.

Unless, that is, the report has NOT been configured to use the credentials of the viewing user.

But first things first. For those new to Power BI administration, keep in mind you may need to be given permissions to share reports from the workspace, and, for that, you need the “Member” role:


Unless you are given a role that allows sharing, you will likely get an ugly little error when trying to share. Although you will still be able to publish a report if you are at least a Contributor, and, then, somebody else could share it with the other users.

That’s the first level of “security” – if a user does not have access to the workspace, the report would have to be shared with them.

Once it is shared, you will find that user’s ability to see data in the report depends on how the datasource has been configured.

You can configure report datasource through the “manage” action:




Then just make sure the checkbox above (“Report viewers can only access this data source with their own Power BI identities using DirectQuery”) is selected. When that option is selected, Power BI will use the report viewer’s credentials to load data from Dataverse.

If the option is not selected, Power BI will be using the default data source credentials (usually the report author’s).

In other words, if the option is not selected OR if the user has permissions to see the data, all data will be visible (#1 on the screenshot below).

If the option is selected AND if the user viewing the report does not have permissions to see certain data, that data will not show up:


So, if you are concerned about the security in paginated reports… it seems you should not be. It’s all been taken care of.

The one remaining piece is better integration with model-driven apps (running reports from the grids, running them from the command bar, etc.). That will likely require some scripting, but let me see if I can come up with something in one of the next posts.

Power BI Paginated Reports vs SSRS reports

Now that SSRS authoring tools have finally been updated (and there is a great article Nick Doelman published the other day on this topic), it might be the right time to ponder on which technology we should be using moving forward since there are two now:

  • Power BI Paginated Reports
  • SSRS


So, how do Paginated Reports fare against SSRS?

To start with, both are somewhat the same: they are both meant to deliver pixel-perfect reports, and, actually, Power BI Paginated Reports use the same “rdl” format as SSRS.

Unlike SSRS reports, Power BI Paginated Reports are supposed to be designed using a tool called Power BI Report Builder. Although, it will probably look surprisingly familiar to anyone who has had some experience with SSRS:


SSRS will run on the reporting server, and, as far as model-driven applications go, it’s always been the case that reporting server would have been hidden from us, developers.

However, we can go to the model-driven app and run reports from there. That said, we have to use FetchXML to build those SSRS reports, and that kind of limits what we can do, since FetchXML, even though quite powerful, does not offer the same capabilities as SQL.

And SSRS is “free”, meaning that it comes with model-driven apps.

That last one can be a big deal if you are thinking of switching to Power BI Paginated Reports, since the first thing you’ll probably realize is that you need a Premium Power BI workspace.

A Premium workspace will cost you quite a bit. This is not to say that Power Apps licensing in general is cheap these days, and this is not to say that everybody can develop SSRS reports. Most likely, if you are thinking of getting SSRS reports developed, you’ll need to start paying a developer, and that is not cheap either; so, in the end, comparing licensing fees alone might not give you an accurate picture.

So, assuming SSRS has been covered so many times and you probably know what’s doable with SSRS, let me cover some of the Power BI Paginated Reports features.

There is a direct SQL connection (to the TDS endpoint)


This may give numerous advantages – using SQL instead of Fetch is, well, a dream come true. But, since Power BI Paginated Reports are not solution-aware, deployment is not going to be as easy.

Also, so far the TDS endpoint is still in preview, so, as much as I’d like to be able to say we can rely on this kind of reporting moving forward, I think we’ll need to wait and see if/when the TDS endpoint goes GA. Still, if and when it does, this comparison will make even more sense.

I mean, it seems to be a no-brainer that writing a SQL query is much easier than creating FetchXML:
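For instance, a simple “active accounts” query that would take a dozen lines of FetchXML is just a few lines of SQL against the TDS endpoint (illustrative only – any table and columns would do):

```sql
-- Plain SQL against the Dataverse TDS endpoint
SELECT name, revenue
FROM account
WHERE statecode = 0   -- active accounts
ORDER BY name;
```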


That easily leads to the following report:


And we can run it in the Report Builder right away to see the output:


This report can be stored in the Premium Power BI Workspace




And, once it’s saved, we can go to the Power BI web site in the browser and see the report there:


Once there, there is no need for the report builder anymore – report users can run this report directly from the Power BI website:


And there is that “Export” option, btw…

What about the security?

This one I’m not sure about right now. There are two sides there:

  • Datasource credentials
  • Access to the report


Access to the report is controlled through report sharing:


I have to admit I have not been able to share my report yet, since there is always an error. Could it be because of the TDS datasource? Possibly – I still have to track it down.

Once I’ve managed to share it, I’ll probably be able to see how it works with the datasource access, too. This may have to wait, though 🙂

Update (Jan 26): there is a follow-up on the security topic in the next post.

And I think here is the kicker, really… Power Automate Integration


It’s more than Power Automate – the point is that there is a REST API we can use with Power BI. And, of course, where there is a REST API, there can be a Power Automate connector:
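Under the hood, the connector is calling the paginated-report export API; the flow is, roughly, “start the export, poll for completion, grab the file” (a sketch only – the `{groupId}`, `{reportId}`, and token are placeholders, and the current Power BI REST API docs have the exact endpoints and polling details):

```http
POST https://api.powerbi.com/v1.0/myorg/groups/{groupId}/reports/{reportId}/ExportTo
Content-Type: application/json
Authorization: Bearer {access_token}

{ "format": "PDF" }
```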


On the list of features which are missing from SSRS (in model-driven apps, at least), I’d rate this #1.

Here is that report in my mailbox:


So, there is definitely something to think about here. Will Microsoft keep supporting both technologies? Will they merge somehow? Will SSRS be migrated to Power BI at some point? Will TDS endpoint become GA soon? Will SSRS (for Model-Driven Apps) get a Power Automate connector?

At the moment, I have no definitive answers, but, hopefully, there is some food for thought in this post. So… choose wisely 🙂

Office App launcher – we can use it in model-driven apps, too, though there is a caveat

Have you ever tried adding Power Apps applications to the Office app launcher?

I actually had not done it myself up until a few days ago, when one of the users mentioned they were observing some strange behavior there.

So I tried, and it was working perfectly fine for me until, just by pure coincidence, we noticed what the difference was.

But first things first… Here is an app launcher panel with my app added to it:


It’s easy to add apps there, and there is a short support video here which will walk you through the process if you are not familiar with it yet:

Exactly the same app is currently showing up in the app launcher when I’m opening it from the model-driven app:


So that’s, really, a nice quick navigation link everybody can set up for themselves to switch between the applications they use often.

There are a few caveats there, since, even though the app launchers on both screenshots above look the same, they are not the same.

This whole post started because one of the users wanted to unpin the app from their app launcher, and they could not. It turned out that option is available in the Office app launcher, but not in the model-driven application app launcher:


Essentially, as of today, the model-driven application app launcher behaves as a read-only version of the Office app launcher in that sense – if you want to change something there, you need to do it from the Office app launcher.

And, on top of that, there are a few caching/synchronization delays which I ran across while trying this out.

Once a business application (canvas/model-driven) is added to the tenant, it might not be available for a while on the “office app launcher” side

This could be really frustrating, since you’d think it would be there, but it just refuses to show up. And this seems to be a caching issue.

Somebody gave me good advice and suggested that I try using the “sync” option at


That seems to have helped, although the result is not always immediate. It may still take a few hours before a new app becomes available to the Office app launcher after this.

Once the app is added to the office app launcher, it may still take a few hours before it starts showing up in the model-driven application app launcher

Apparently, there is another cache involved somewhere. Not sure if there is a way to quickly refresh it, but, from what I saw, it may take another day before you’ll see that app in the model-driven app launcher.

All that said, it seems to be a nice feature that can help users navigate through different apps quickly. Just don’t panic if they start calling you about somewhat unexpected launcher behavior – it might well be one of the issues above.

User-Local behavior with Date-Only format… what’s that for?

I’ll be absolutely honest – I have no idea what the answer to the question above is. Except that this combination of settings might be a legacy of older versions of Dynamics CRM?

However, if I were to make a suggestion here, I’d say don’t use that combination, since it may lead to confusing results.

But first things first –  here is what I’m talking about:


All those settings are described in the docs:

What remains unclear after looking at the docs is what the time portion of the date and time field is going to be when it’s configured as per the screenshot above.

That’s easy to test. Here is what I have in the model-driven UI:


And here is what I see when querying that record through Web API:


I’m in EST, so my timezone offset is 5 hours. Which means the time portion there represents 00:00:00 in my timezone – Jan 11 is just starting.

The tricky part is that, since the field is timezone-enabled, looking at the same value from another timezone might yield somewhat strange results.

If I were in GMT-6, I’d see that same value as Jan 10:


And, if I were closer to the UTC, I’d see that as Jan 11:


What’s happening there is that a user in any timezone that’s farther from UTC than my timezone will see the date I entered less one day.
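To put some numbers on it, here is the same stored value viewed through a few different offsets (a runnable sketch – the stored UTC value below matches the EST example above):

```csharp
using System;

class Program
{
    static void Main()
    {
        // The server stores a single UTC value; this one was
        // entered as "Jan 11" by an EST (UTC-5) user.
        var stored = DateTimeOffset.Parse("2021-01-11T05:00:00Z");

        Console.WriteLine(stored.ToOffset(TimeSpan.FromHours(-5))); // EST:   Jan 11, 00:00 -> shows "Jan 11"
        Console.WriteLine(stored.ToOffset(TimeSpan.FromHours(-6))); // UTC-6: Jan 10, 23:00 -> shows "Jan 10"
        Console.WriteLine(stored.ToOffset(TimeSpan.Zero));          // UTC:   Jan 11, 05:00 -> shows "Jan 11"
    }
}
```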

Question is: why is it moving in that direction and not the other, and why is it moving at all? And, then, how do we account for those movements properly when migrating data, or even when typing in the date manually, so that users in different timezones could, actually, see what they need to see?

Unless there is a good example of where this helps, the actual answer seems to be: don’t use that combination of settings. Instead, opt for one of the other behaviors:

Date Only behavior will ensure that the date is stored as is on the server. And the time? It’ll be set to 12:00 AM. But, since it’ll be stored and presented as the same date and time in all timezones (12 AM on Jan 11), that’s not going to cause confusion.

Timezone-independent behavior might be another good option here. It’s, actually, not quite clear what the difference is between “Date Only” behavior and “Timezone independent” behavior with the “Date only” format (you might also want to see what Scott Durow had to say on this topic), but both should yield predictable results.

Now, this is how it looks in the classic designer. In the maker portal, things are somewhat different, since there are two different data types there:


But, if I created a “Date Only” field in the maker portal (with “User Local” behavior), here is how that field would look in the classic designer:


In other words, it seems those two data types in the maker portal correspond to the “Format” in the classic designer.

And, so, what I wrote about “User Local”/“Date Only” above is also applicable to the “Date Only” type with “User Local” behavior in the maker portal. It seems that combination should be avoided – unless, of course, you have a good example of why it should not be?

Hidden fields are not “required” anymore in model-driven forms?

Coincidentally, there seems to be a change in how required fields are treated in model-driven forms. I’m saying “coincidentally”, since I recently wrote a post on a related (though different) topic:

And, then, a co-worker just mentioned that, it seems, hidden fields are not “required” anymore in model-driven forms, even if they have been configured to be “business required” and even when they are still on those forms.

I am pretty sure he is correct that such fields, even recently, would have caused a form to display a notification message (that a required field is empty); but, since this behavior has changed… unless we both imagined it… there is no way I can reproduce it now.

What I can show is how it works today. For example, here is a form with two required fields:


I can mark both as hidden so they wouldn’t be showing up on the form anymore:


And, then, I can create a new record (a new tag, in my case) without even populating those fields:


I’m still wondering when the change happened, but, to be honest, this new(?) behavior makes total sense to me.

Image control for model-driven apps

You may have seen the announcement:


So, I was looking at it tonight and realized that I should probably take it for a spin on my phone.

After all, one of the obvious scenarios would be to have a few image fields on the form so we could open that form on a mobile device, take pictures, and save those pictures in the image fields. That would be great… but would it work?

Absolutely, here is a short video:

And, from there, I can continue working with the uploaded image either on the phone or on the laptop:


That does look pretty good for all kinds of “field service” scenarios (including inspections etc), and it’s all there right out of the box.

So, well, have fun with the new capabilities!

How to: not get stuck on required fields when creating a Power Automate Flow

Microsoft Dataverse has a somewhat unique interpretation of what a “required” field is.

It’s a bit of “do as I say not as I do”.

For example, I can’t create an account in the user interface without having “account name” populated (since it’s a required field):


But, of course, behind the scenes… whether through Excel import or through the API, for instance… I can still clear those required fields:


It’s always been like this; but, now, how do we bypass the “required field” validation when creating a record through a Power Automate Flow? Sometimes, there is no field value I can populate through the automation.


That was bugging me for a while, till I tried using “null” for such fields. Not sure what hit me this time, or why I did not do it before, but it worked right away:
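In other words, the underlying Web API request presumably ends up looking something like this (a sketch only – “name” here just stands in for whichever required column you need to leave empty, and “telephone1” for whatever you do populate):

```json
{
  "name": null,
  "telephone1": "555-0100"
}
```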




Well, that will save me some trouble moving forward 🙂