Power Apps ALM with Git (theory)

 

I’ve definitely been struggling to figure out any kind of sane “merge” process for configuration changes, so I figured I’d try to approach ALM differently, using the good old “master configuration” idea (http://gonzaloruizcrm.blogspot.com/2012/01/setting-up-your-development-environment.html)

Here is what I came up with so far:

[Image: the proposed ALM process diagram]

 

  • There are two repositories in Git: one for the code, and another one for the unpacked solution. Why two repos? We can use merge in the code repository, but we can’t really use merge in the solution repository. Instead, it’ll have to be “push --force” to the master branch in that repo, so the files are always replaced (not merged) with whatever comes from the Dev instance. Am I overthinking it?
  • Whenever there is a new feature to develop, we should apply configuration changes in the main DEV instance directly. The caveat is that they might be propagated to QA/UAT/PROD before the feature is 100% ready, so we should try to isolate those changes through new views/forms/applications, which we can eventually delete (and, since we are using managed solutions in QA/UAT/PROD, the “delete” will propagate to those environments through the managed solution)
  • At some point, once we are satisfied with the configuration, we can push (force) it to the solution repo (see the sketch after this list). Then we can use a DevOps pipeline to create a feature Dev instance from Git. We will also need to create a code branch
  • In that feature Dev instance, we’ll only be developing code (on the feature code branch)
  • Once the code is ready, we will merge it into the master branch, refresh the Feature Dev instance from the main Dev instance, register the required SDK steps and event handlers in the main DEV instance, and update the solution repo. At this point the feature might be fully ready, or we may have to repeat the process again (maybe a few times)
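
By the way, here is a rough sketch of what that “update the solution repo” step could look like, just to make it more concrete. This is Python for illustration only; it assumes the solution zip has already been exported from the main DEV instance, that SolutionPackager.exe is on the PATH, and all the paths and names below are made up:

import subprocess

SOLUTION_ZIP = r"C:\temp\MySolution.zip"    # hypothetical export location
SOLUTION_REPO = r"C:\repos\solution-repo"   # hypothetical local clone

def run(args, cwd=None):
    subprocess.run(args, cwd=cwd, check=True)  # fail fast if any step errors out

# Unpack the solution right over whatever is in the repo
run(["SolutionPackager.exe",
     "/action:Extract",
     f"/zipfile:{SOLUTION_ZIP}",
     f"/folder:{SOLUTION_REPO}",
     "/packagetype:Both",     # keep managed and unmanaged together
     "/allowDelete:Yes"])     # let removed components disappear from the repo

# Replace, don't merge: commit everything and force-push to master
# (the commit will fail if there are no changes - a real script would check first)
run(["git", "add", "-A"], cwd=SOLUTION_REPO)
run(["git", "commit", "-m", "Sync unpacked solution from main DEV"], cwd=SOLUTION_REPO)
run(["git", "push", "--force", "origin", "master"], cwd=SOLUTION_REPO)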

 

We might utilize a few DevOps pipelines there:

  • One pipeline to create an instance, deploy a solution, and populate sample data in the Feature Dev instance (to use when we are starting to work on the code for the feature)
  • Another pipeline to push (force) the unpacked managed/unmanaged DEV instance solution to Git. This one might be triggered automatically whenever a “PublishAll” event happens; I might try using a plugin to kick off the build (see the sketch after this list)
  • Another pipeline to do smoke tests with EasyRepro in the specified environment (might run smoke tests in Feature Dev, but might also run them in the main Dev)
  • And yet another pipeline to deploy managed solution to the specified environment (this one might be a gated release pipeline if I understand those correctly)
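
As for kicking off the build from a plugin: the plugin itself would be C#, of course, but, just to illustrate the idea, here is a minimal sketch (again in Python) of the Azure DevOps REST call it would have to make to queue a build. The organization, project, build definition id, and the personal access token below are all placeholders:

import base64
import json
import urllib.request

ORG = "myorg"                  # hypothetical DevOps organization
PROJECT = "powerapps-alm"      # hypothetical project
DEFINITION_ID = 42             # hypothetical build definition id
PAT = "personal-access-token"  # would be stored securely in real life

url = f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/build/builds?api-version=5.0"
auth = base64.b64encode(f":{PAT}".encode()).decode()

request = urllib.request.Request(
    url,
    data=json.dumps({"definition": {"id": DEFINITION_ID}}).encode(),
    headers={"Content-Type": "application/json", "Authorization": f"Basic {auth}"},
    method="POST",
)

with urllib.request.urlopen(request) as response:
    print(json.load(response)["status"])  # a queued build starts as "notStarted"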

Team development for PowerApps

 

Team development for Dynamics has always been a somewhat vague topic.

To start with, it’s usually recommended to use SolutionPackager – presumably, that helps with source control, since you can unpack solution files, then pack them back, and observe how individual components have changed from one commit to another. But what does it really give you? Even Microsoft itself admits there is this limitation:

[Image: excerpt from the Microsoft documentation on the limits of manually editing extracted solution files]

https://docs.microsoft.com/en-us/dynamics365/customer-engagement/developer/use-source-control-solution-files

In that sense you might, of course, use Git to merge various versions of the solution component files, but that would be no different from manual editing which, as per the screenshot above, is only partially supported.

The only real “merge” option we have (at least as of now) is deploying our changes to the target environment through a solution file or, possibly, re-applying them manually in that environment using the solution designer.
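
The solution file half of that is at least mechanical: we can pack whatever sits in source control back into a zip that can be imported into the target environment. A minimal sketch (Python for illustration, with made-up paths), assuming the repo was unpacked with /packagetype:Both:

import subprocess

SOLUTION_REPO = r"C:\repos\solution-repo"       # hypothetical local clone
OUTPUT_ZIP = r"C:\temp\MySolution_managed.zip"  # hypothetical output location

# Pack the unpacked component files back into a managed zip
# that a pipeline (or a person) can then import into the target
subprocess.run(
    ["SolutionPackager.exe",
     "/action:Pack",
     f"/zipfile:{OUTPUT_ZIP}",
     f"/folder:{SOLUTION_REPO}",
     "/packagetype:Managed"],
    check=True,
)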

That might be less of a problem if all Dynamics/PowerApps source artefacts were stored in CDS. But, of course, they are not. Plugin source code and TypeScript sources for the various JavaScript web resources are supposed to be stored in source control. And, even more, the solution itself is better kept in source control, just so we don’t lose everything when somebody accidentally deletes the PowerApps environment.

So what do we do? And why do we need to do anything?

Developers are accustomed to established development practices, so it’s no wonder they want to utilize the same familiar Git workflows with Dynamics/PowerApps.

I am not sure I can really suggest anything magical here, but, it seems, we still need a way to somehow incorporate solutions into the Git workflow which looks like this:

[Image: Gitflow-style branching diagram showing the master and develop branches]

https://www.quora.com/What-is-the-difference-between-master-and-develop-branch-in-Git (although, I guess, the original source is not Quora)

Come to think of it, the only idea I really have when looking at this diagram is:

  • Creating a branch in Git would be the equivalent of creating a copy environment in Dynamics/CDS
  • Merging in Git would be the equivalent of bringing in a transport solution and/or re-applying configuration changes from the corresponding feature development environment to the higher “branch” environment

 

That introduces a bunch of manual steps along the way, of course. Besides, creating a new environment in PowerApps is not free – previously, we would have to pay for each new instance. If your subscription is storage-based these days, then, at the very least, you need to ensure you have enough additional storage in your subscription.

And there is yet another caveat – depending on what it is you need to develop on the “feature branch”, you may also need some third-party solutions in the corresponding CDS environment, and those solutions may require additional licenses, too.

At the very least, we need two environments:

  • Production (logically mapped to the master branch in Git)
  • Development (logically mapped to the development branch in Git)

 

When it comes to feature development, there might be two scenarios:

  • We may be able to create a separate CDS environment for feature development, in which case we should also create a source code branch
  • We may not be able to create a separate CDS environment for feature development, in which case we should not be creating a source code branch

 

Altogether, the whole workflow might look like this:

[Image: the combined branching/environment workflow diagram]

We might create a few more branches for QA and UAT – in that case, QA, for example, would take the place of Master on the diagram above. From QA to UAT to Master, it would be the same force push followed by build and deploy.

Of course, there is one remaining step here: I need to build out a working example, probably in DevOps…

PS. On the other hand, if somebody out there reading this post has figured out how to “merge” unpacked solution components in source control without entering the “unsupported area”, maybe you could share the steps/process. That would be awesome.


Public Preview of PowerApps Build Tools

 

Recently, there was an interesting announcement from the Power Apps Team:

[Image: screenshot of the PowerApps Build Tools announcement]

https://powerapps.microsoft.com/en-us/blog/automate-your-application-lifecycle-management-alm-with-powerapps-build-tools-preview/

Before I continue, I wanted to quickly summarize the list of Azure DevOps tasks available in this release. Here it goes:

  • PowerApps Tools Installer
  • PowerApps Import Solution
  • PowerApps Export Solution
  • PowerApps Unpack Solution
  • PowerApps Pack Solution
  • PowerApps Set Solution Version
  • PowerApps Deploy Package
  • PowerApps Create Environment
  • PowerApps Delete Environment
  • PowerApps Copy Environment
  • PowerApps Publish Customizations

This looks interesting, yet I can’t help but notice that Wael Hamze has had most of those tasks in his build tools for a while now:

https://marketplace.visualstudio.com/items?itemName=WaelHamze.xrm-ci-framework-build-tasks

Actually, I’ve seen a lot of different tools and scripts which were all meant to facilitate automation.

How about Scott Durow’s SparkleXrm? (https://github.com/scottdurow/SparkleXrm)

Even I tried a few things along the way (https://www.itaintboring.com/tag/ezchange/, https://www.itaintboring.com/dynamics-crm/a-powershell-script-to-importexport-solutions-and-data/)

So, at first glance, those tasks released by the PowerApps team might not look that impressive.

But, if that’s what you are thinking, you might be missing the importance of this release.

Recently, the PowerApps team has taken a few steps which might all be indicating that the team is getting serious about “healthy ALM”:

  • The Solution Lifecycle Management whitepaper was published in January
  • The solution history viewer was added to PowerApps/Dynamics
  • Managed solutions have become “highly recommended” for production (try exporting a solution from the PowerApps admin portal, and you’ll see what I’m talking about)

And there were a few other developments: Flows and Canvas Apps became solution-aware, the Solution Packager was updated to support the most recent technologies (Flows, Canvas Apps, PCF), etc.

The tooling, however, was missing. Of course, there has always been third-party tooling, but I can see how somebody on the PowerApps team decided that it’s time to create a solid foundation for the ALM story they are going to build, and there can be no such foundation without suitable internal tooling.

As it stands now, that tooling might not really be superior to what the community has already developed in various forms. But the importance of it is that the PowerApps team is demonstrating that they are taking this whole ALM thing seriously, and they’ve pretty much stated that in the release announcement:

“This initial release is the first step towards a more comprehensive, yet simplified story around ALM for PowerApps. A story we will continue to augment by adding features based on feedback, but equally important – by continuing to invest in more training and documentation with prescriptive guidance. In other words, our goal is to enable our customers and partners to focus more on innovation and building beautiful, innovative apps and less time on either figuring out how to automate or perform daunting manual tasks that are better done automated.”

So… I’m eager to see how it’s going to evolve – it’s definitely been long overdue, and I’m hoping we’ll see more ALM from the PowerApps team soon!

PS. There is a link buried in that announcement that you should definitely read through as well: https://pabuildtools.blob.core.windows.net/docs/PowerApps%20Build%20Tools.htm Open that page and scroll down almost to the bottom. There you will find a “Tutorial”, and, right at the start of the tutorial, you’ll see a link to the hands-on lab. Make sure to download it! There is a lot of interesting stuff there, and it will give you a pretty good idea of where ALM is going for PowerApps.

When the error message is lost in translations

Every now and then, I see this kind of error message in the UCI:

[Image: a generic error dialog in the UCI]

It may seem useful, but, when looking at the log file, all I can really say is that something has happened, since all I can see in the downloaded log file is a bunch of call stack lines similar to the one below:

at Microsoft.Crm.Extensibility.OrganizationSdkServiceInternal.Update(Entity entity, InvocationContext invocationContext, CallerOriginToken callerOriginToken, WebServiceType serviceType, Boolean checkAdminMode, Boolean checkForOptimisticConcurrency, Dictionary`2 optionalParameters)

One trick I learned about those errors in the past is that switching to the classic UI sometimes helps, since the error may look more meaningful there. This time around, though, I was not able to reproduce the error above in the classic UI, so here is another trick if you run into this problem:

  • Open browser dev tools
  • Reproduce the error
  • Switch to the “Network” tab and look for the errors

There is a chance you’ll find a request that errored out, and, if you look at it, you might actually see the error message:

[Image: the failed request in the “Network” tab, with the actual error message in the response]
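
For what it’s worth, what you are looking for in that failed request is usually a JSON body where the actual message sits under the “error” property. Purely for illustration (the exact code and wording will vary):

# Roughly what the response body of a failed request looks like
failed_response_body = {
    "error": {
        "code": "0x80040265",  # for example: the generic "plugin threw an exception" code
        "message": "The actual, human-readable error message ends up here",
    }
}
print(failed_response_body["error"]["message"])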

That said, I think it’s been getting better lately, since more and more errors do show up correctly in the UCI. Still, sometimes the errors seem to be literally lost in translation between the server and the error dialog on the browser side, so the trick above might help you get to the source of the problem faster in such cases.