Monthly Archives: November 2018

Dynamics Portals: setting up event registration web site (Part 3)

 

We’ve seen the screenshots of the event registration web site:

http://www.itaintboring.com/dynamics-crm/dynamics-portals-setting-up-simple-event-registration/

We’ve also seen how the event and event registration entities are set up / exposed: http://www.itaintboring.com/dynamics-crm/dynamics-portals-setting-up-event-registration-web-site-part-2/

It’s time to bring it all together and have a look at the event details page. That’s a strange page, really. It’s not exposing a form or an entity list. Instead, it’s relying on its own custom-built web template, which uses a combination of Liquid and JavaScript. The reasoning behind this design is that:

  • Portal users can land on that page just to have a look at the details of that event
  • However, if they have already registered for the event, there are a few things that have to happen differently on the page

 

Or, if you wish, here is what they are supposed to be able to do:

  • When a user is not logged in yet, that user should be able to see event details, but should not be able to register
  • A logged in user should be able to register for the event (or, at least, for the waiting list)
  • A logged in user who has already registered for the event should be able to un-register

 

If you recall from the previous post, depending on whether users come to that page from the event list or from the event registrations list, the page will receive the ID of either the event or the registration through the URL parameter:

eventlist/eventdetails/?id=0772e40b-a6a9-e811-a977-000d3af49637

Now let’s see how the page is defined.

The corresponding Web Page record does not expose a web form, entity form, or entity list:

image

Why can’t I just expose the form for the event through that web page record? This is because the ID parameter passed to that page will represent an event record in some cases, and it will represent an event registration record in other cases. In the latter scenario, entity form linkage won’t work.

Still, the page template is linked to the ita_event entity, since I want to be able to expose a form through Liquid by using this kind of statement: {% entityform name: 'Event Form' %}

image

Ultimately, all the work happens in the associated Web Template:

image

Here is the complete source code for that template:

 

The most important part is that the first thing this code does is try to figure out whether the ID parameter corresponds to an event registration (by querying event registrations through FetchXml). If such an event registration exists, the template will emit a small piece of JavaScript into the output to store the event registration id for future use:

<script>eventRegistrationId = '{{ eventregistration.results.entities.first.ita_eventregistrationid }}';</script>

And then it will replace the “id” parameter with the id of the corresponding event:

{% assign request.params['id'] = eventregistration.results.entities.first.ita_eventid %}

Once that’s been done, we can actually render the entity form:

{% entityform name: 'Event Form' %}

The form will be looking at the id parameter to get the data, but, by that moment, the id parameter will already be referencing an event.
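To make it easier to follow, here is a simplified sketch of how those pieces fit together in the web template. It’s an approximation rather than the complete template: the ita_eventregistration entity name and the FetchXml filter are placeholders, while the attribute names and the assign/entityform statements are the ones quoted above.

{% comment %} Simplified sketch - the entity name and filter details are approximate {% endcomment %}
{% fetchxml eventregistration %}
  <fetch top="1">
    <entity name="ita_eventregistration">
      <attribute name="ita_eventregistrationid" />
      <attribute name="ita_eventid" />
      <filter>
        <condition attribute="ita_eventregistrationid" operator="eq" value="{{ request.params['id'] }}" />
      </filter>
    </entity>
  </fetch>
{% endfetchxml %}
{% if eventregistration.results.entities.size > 0 %}
  <script>eventRegistrationId = '{{ eventregistration.results.entities.first.ita_eventregistrationid }}';</script>
  {% assign request.params['id'] = eventregistration.results.entities.first.ita_eventid %}
{% endif %}
{% entityform name: 'Event Form' %}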

And that is the bulk of it. What’s left, and you’ll find it in the code (there is also a sketch of it right after this list), is a bit of JavaScript to:

  • Change “Subscribe” button title to Un-Subscribe when it’s an existing registration
  • Hide the button completely if the portal user has not logged in yet
  • Replace the button title with “Join the waiting list” if there are no slots available for the event
  • Replace the button title with “Confirm participation” if a “waiting list” user is coming back through the link they got by email once a new slot became available
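Here is, roughly, what that JavaScript boils down to. This is a simplified sketch rather than the actual code: the button selector and the userLoggedIn / slotsAvailable / waitingListConfirmation variables are placeholders that would be emitted by Liquid, just like eventRegistrationId above.

// Simplified sketch of the client-side logic. The #UpdateButton selector and the
// userLoggedIn / slotsAvailable / waitingListConfirmation variables are placeholders -
// in the real template they would be emitted by Liquid, just like eventRegistrationId.
$(document).ready(function () {
    var submitButton = $('#UpdateButton'); // hypothetical selector for the entity form button

    if (!window.userLoggedIn) {
        // Anonymous visitors can look at the event details, but can't register
        submitButton.hide();
        return;
    }

    if (window.eventRegistrationId) {
        if (window.waitingListConfirmation) {
            // A waiting-list user came back through the email link once a slot opened up
            submitButton.val('Confirm participation');
        } else {
            // An existing registration can only be cancelled from here
            submitButton.val('Un-Subscribe');
        }
    } else if (!window.slotsAvailable) {
        submitButton.val('Join the waiting list');
    } else {
        submitButton.val('Subscribe');
    }
});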

 

So, depending on the context, a user can look at the event, subscribe, confirm the registration, unsubscribe, etc.

Sounds straightforward.. or did you notice a problem above?

Well, we are still talking about the event details page which is linked to an event record, and it’s exposing that record through a corresponding event form. So how can somebody do anything with their event registration just by working with the event record?

That part of the processing happens in Dynamics. Let’s have a look at the event form again – there are a few fields which are not mandatory and which, at first glance, you would probably expect to belong to the event registration entity:

image

But this is exactly what makes it a working solution. Whenever a portal user does something on the event details page, the portal user’s Contact Id is stored in the Contact Id field of the event, and the event registration id is stored in the event registration id field (if it’s an existing registration). Remember that JavaScript code I mentioned above?

<script>eventRegistrationId = '{{ eventregistration.results.entities.first.ita_eventregistrationid }}';</script>

There is corresponding JavaScript code that will store that event registration id in a field on the portal side when required:

$('#ita_eventregistrationid').val(eventRegistrationId);

Then there are those two checkboxes: “Client Confirmed” (confirming a waiting list registration) and “Subscribe” (subscribing to the event).

This brings us back to the server side, where there are a few workflows which will take that data from the event record and process it (create an event registration, confirm it, deactivate it, etc.).

So there is yet another part to this solution, which I will be describing in the next post:

Dynamics Portals: setting up event registration web site (Part 4)

Dynamics Portals: setting up event registration web site (Part 2)

 

Personally, I feel that customizing Dynamics Portals requires a special attitude. We are locked out of server-side development other than Liquid, we can’t use our familiar Visual Studio development environment, and we don’t have access to the usual debugging tools.. Basically, it’s a very unusual experience.

That said, there is a lot of goodness in the portals.. as long as you don’t think of them as a regular web application.

Just to recap quickly, here is what we can certainly do:

  • On the server side, we can use Liquid. It’s not exactly the same as C# or PHP, but it’s really fine-tuned for use with Dynamics
  • On the client side, we can use JavaScript. Again, the development experience is not nearly as good as in your usual dev tool (which is, I’m assuming, Visual Studio)
  • However, where the portals really shine is in their ability to surface data from Dynamics almost immediately. You can create a list.. or a form.. link it to a page, and the data will show up. On top of that, there is a bunch of configuration settings which you can throw in quickly (see the short Liquid example after this list)
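Just to illustrate how little it takes, here is a minimal Liquid sketch that surfaces some Dynamics data in a web template (the ita_name attribute is just a placeholder):

{% fetchxml events %}
  <fetch>
    <entity name="ita_event">
      <attribute name="ita_name" /> <!-- placeholder attribute name -->
    </entity>
  </fetch>
{% endfetchxml %}
<ul>
  {% for event in events.results.entities %}
    <li>{{ event.ita_name }}</li>
  {% endfor %}
</ul>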

 

So, there is good and bad. Like I said at the beginning, Dynamics Portals require a bit of a different attitude as far as development is concerned.

In the case of the event registration web site, all the work revolves around 3 web pages:

  • Event List (/eventlist)
  • Event Registrations (/eventregistrations)
  • Event Details (/eventlist/eventdetails)

I’ll describe the first two here, and the last one, since it’s somewhat more complicated, will be described in the next post.

Here is what the Event List page looks like:

image

Essentially, it’s a page that displays an entity list. That’s a simple entity list – nothing extraordinary there:

image

All we want to do there is display the list of events, and we want to make sure that, for the details view, we are using the “Event Details” page, to which we will pass the event id as a parameter.

The Event Registrations page is not too complex either.

There is another entity list for Event Registrations, so it’s linked to the web page:

image

What may look a bit strange is that I am using the same details page for event registrations that I was using for the events above:

image

This time, though, ita_eventregistrationid is passed as the ID parameter to the details page. Other than that, there is nothing extraordinary about the event registrations page either. It will be displaying more than one view, so there will be a view selector allowing the portal user to switch between Upcoming Events, Waiting List events, and Past Events. And yes, there is a filter condition on the Event Registrations entity list:

image

Since we want portal users to see their own event registrations only.

On the Dynamics side, here is what those two entities look like:

image

image

The real work will be happening on the Event Details page, and there will also be a few workflows in Dynamics to support that page. The page itself is using a custom web template (implemented with Liquid and JavaScript), and the workflows are using my TCS Tools solution. The latter is there because I needed some advanced workflow activities which are not available out of the box. This way I was able to avoid having to do any server-side coding at all while working on the portal customizations. Well, unless you count that web template, of course.

We’ll have a look at the “Event Details” page in the next post: Dynamics Portals: setting up event registration web site (Part 3)

Dynamics Portals: setting up event registration web site

 

Did you ever think of setting up an event registration system on Dynamics Portals? It’s really not that complicated, though it’s not quite straightforward either. Here is what I’m going to do in this series of posts:

  • In this first post, I’ll show you what it may look like once it’s all been set up
  • In the follow-up posts, I’ll walk you through the process of setting up this kind of portal

 

So, to start with, let’s look at the back end.

In my custom solution, I have two new entities – “events” and “event registrations”:

image

Events are events, event registrations are just registration records that link contacts to those events.

Here is an example of one of those events:

image

When looking at it from the portal side, here is what I see:

image

There is an event scheduled for the 20th of December. It can only have 1 attendee, and there is still one registration slot remaining.

As a portal user, I can click on an event link, and I’ll see what that event is about:

image

This is where I can also subscribe to the event (I can only do that if I’m a registered portal user). Once I choose to subscribe, the portal will bring me to the Event Registrations page, where I’ll see my registration:

image

Now I just need to make sure I don’t forget that I have registered, but that has nothing to do with the portal. Or, if I change my mind, I can go to that event registration on the portal and Un-Subscribe:

image

Of course, if I want to see those registrations in Dynamics (as an event admin, for example), I can always do that:

image

image

That was the happy path, of course.

So what would happen if there were no slots remaining? Well, I would have an option to join the waiting list:

image

In which case I could see my event registration in the waiting list:

image

So if I went to the event registration page this time, the portal would tell me that it’s only a waiting list registration:

image

And yes, if somebody decided to opt out of the event (un-subscribe), then the first contact on the waiting list would get an email notification and a link they could follow to confirm their registration for the event. If they decided not to do anything, then, two days later, their waiting list registration would be cancelled and the next contact on the waiting list would get an email.

But we’ll be getting into the details of how it all works in the follow-up posts – Part 2 of this post is available here now.

Windows Containers – what are they?

 

There are lots of technologies I’m not familiar with, but what I started noticing lately is that they have been evolving so fast that it’s sometimes difficult to even figure out a good starting point when trying to become more familiar with them.

Imagine waking up in the driver’s seat of a car moving at a relatively high speed without having any experience of driving a car at all. That’s definitely not a good way to start getting that experience – you’d probably want to learn some basics in a parking lot first.

Unfortunately, this is how I often feel now when looking at another new technology, Windows containers included. You keep hearing about Docker containers, so one day you decide to take a closer look, and the feeling is exactly that – it’s the feeling of waking up in the moving car when you have no idea where the steering wheel even is.

I’ve spent several hours over the last few days trying to get through all the posts describing container technology, and then realized that a few things which are often mentioned simply add to the confusion. That was until most of it finally came together, so, if you are in the same boat (or car), this post might help you, too. Just keep in mind that I’m looking at containers from the Windows perspective.

  1. Conceptually, containers are application deployment packages which include more than just the application itself. They also include the dependencies required to run the application, and all of that comes with a high level of isolation. Sounds like a virtual machine? Well.. keep reading
  2. Docker has been one of the leading technologies behind containers, though until a few years ago it was mostly about Linux (then again, Docker has been around for only about 5 years so far)
  3. You may hear that containers are not virtual machines since they rely on OS-level virtualization (so they are not supposed to be running their own independent OS per se). It seems that’s how it used to be in the Linux world.. until Microsoft (and Docker) decided to throw in Windows containers, and that’s where things started to get wild

 

With OS-level virtualization, the idea used to be that an application running in a container would still be using the host operating system. From the application’s standpoint, everything would look quite isolated, but, in the end, that is the main reason why running a Linux container on a Windows host would not be possible. You can’t make Windows look like Linux, after all.

And then we got two types of containers in Windows. There are Windows Server containers (those are classic OS-level containers), and there are Hyper-V containers. The latter are probably still called “containers” since, from the Docker standpoint, they work like any other Docker container. But, technically, they are running as virtual machines (presumably highly optimized, but still). For example, you can run a Windows container on Windows 10, and that container can be using the Windows Server 2016 OS.
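If you want to see the difference in practice, the Docker CLI lets you pick the isolation mode explicitly. For example (assuming Docker is installed on a Windows host and the commands are run from a PowerShell prompt; the base image tag below is just a placeholder, since the available tags keep changing):

# a "classic" Windows Server container: it shares the host kernel, so the image version has to match the host OS
docker run --isolation=process mcr.microsoft.com/windows/servercore:ltsc2019 cmd /c ver

# a Hyper-V container: it runs inside a lightweight utility VM, so the container OS version can differ from the host
docker run --isolation=hyperv mcr.microsoft.com/windows/servercore:ltsc2019 cmd /c ver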

At this point, you may want to look at these 3 articles/posts – there is a lot of information there, but it should help you get your head around the main concepts, at least:

https://xebia.com/blog/deep-dive-into-windows-server-containers-and-docker-part-2-underlying-implementation-of-windows-server-containers/

https://docs.microsoft.com/en-us/virtualization/windowscontainers/deploy-containers/linux-containers

https://www.docker.com/products/windows-containers

Now, containers are great, but you need tools to deploy the containers, to monitor them, to automate various deployment steps, etc.

This is where we need to start looking at frameworks like Kubernetes:

https://kubernetes.io/

There are others, such as Docker Swarm. But Kubernetes is the main one at this point.

Let’s say we have containers and we have some kind of framework to manage their deployment. Where are we going to deploy them, though?

This is where three leading cloud companies (Microsoft, Google, and Amazon) have something to offer, and, if you are trying to compare those offers, have a look at this post:

https://blog.hasura.io/gke-vs-aks-vs-eks-411f080640dc

At this point I would assume that, if you are using one of those clouds for something else, you might choose the corresponding Kubernetes implementation from the same vendor. In other words, if you are a Microsoft customer utilizing various cloud services from Microsoft, look at AKS first, since it’ll be using Azure infrastructure when deploying the containers.

Also, if you run into the term “Azure Container Services”, have a look at this post to see why it’s no more.. it’s now “Azure Kubernetes Service”:

https://techcrunch.com/2017/10/24/microsoft-new-azure-kontainer-service-puts-its-focus-on-kubernetes/

Now, what about the size of Windows containers? The size of the base Windows container images varies from a few hundred MB to tens of GB, as you will see from the post below:

https://www.jamessturtevant.com/posts/Windows-Containers-Cheat-Sheet/

The same post actually describes a few other useful concepts, such as the container upgrade cadence.

Microsoft has its own introduction to containers, which is really useful, though it is focused on Windows Containers and does not expand on how they are related to containers in general:

https://docs.microsoft.com/en-us/virtualization/windowscontainers/about/#container-orchestrators

So, from the Windows Containers standpoint, how does the licensing work? There is an answer here:

https://www.microsoft.com/en-us/licensing/product-licensing/windows-server-2016

The way I see it, we don’t need to license the containers themselves, but, depending on the licence we have for the host system, we get different rights for the containers running on that host system.

And, finally, what is the main use case? Why not just deploy applications the old way?

I am still not clear on what the perfect use case for Windows Containers is. It seems there are only two base images (Windows Server Core and Nano Server), so, by packaging our apps into Windows containers, we are simply saying that we’ll be using one of those two versions of the operating system, and, of course, we are adding quite a bit of “fat” around our applications, since it’s not just the application code that we need to install now. Why not use a virtual machine and deploy multiple apps on that machine instead? Would that not actually save resources and/or time?

You can look at these two posts to get an idea of the answer:

https://blogs.msdn.microsoft.com/msgulfcommunity/2015/09/07/why-windows-server-containers-and-why-you-need-to-look-at-containers-hands-on/

https://techbeacon.com/3-reasons-why-you-should-always-run-microservices-apps-containers

To me it seems that containers belong somewhere in between virtual machines and old-fashioned application deployment models. If you were to provide a full VM for each of your applications, that would be one way to do it, but it would take a lot of resources. You would have great isolation and packaging, though. If you were to deploy an application directly on the host system, that would take the least amount of resources from the host, but it would provide the lowest level of isolation and/or packaging. In both cases, when saying “packaging”, I just mean that every application has some dependencies. For example, a web application requires some kind of web server. Different web applications may require different web servers, etc.

Containers are somewhere in between. You can package the app itself and all the required components into one single container, then take that container and move it between different environments as is. Compared to a VM, you are not really getting a dedicated machine, but, compared to a standalone app, you are getting all the dependencies embedded in the container.
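In Docker terms, that packaging step is just a Dockerfile. Here is a hypothetical sketch (the IIS base image tag and the MyWebApp folder are placeholders):

# Hypothetical example: a web app and the IIS web server it depends on, packaged into a single image
# that can then be moved between environments as is
FROM mcr.microsoft.com/windows/servercore/iis:windowsservercore-ltsc2019
COPY ./MyWebApp/ /inetpub/wwwroot/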

Hope that helps. Any additions/clarifications/corrections? Let me know..

EZ Change: let’s do some configuration data migration

 

It’s not all about solutions, workflows, configuration changes, etc. Sometimes, we just need to move data from one environment to another before making any configuration changes, and, of course, we need to keep the IDs..

image

Can we use SSIS? We can. Can we use SCRIBE? We can. But what if we did not want anything extremely complicated in terms of data conversions, yet we still wanted to move our configuration data and, at the same time, move some of the configuration changes (or, maybe, the whole Dynamics solution) right after that?

Easy.. with EZ Change of course.

In the previous two posts, we looked at a few other scenarios EZ Change can handle, so I’m just going to add the links here so as not to repeat myself:

http://www.itaintboring.com/dynamics-crm/automated-deployment-and-dynamics/

http://www.itaintboring.com/dynamics-crm/automated-deployment-adding-a-new-field-and-migrating-field-data/

Let’s just look at how to move the data this time.

1. For the build actions, we need an export data action with the FetchXml to identify the data (contacts in this case)

image
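Just for reference, the FetchXml for a simple contact export could be as basic as this (a sketch – the attribute list and the “active records only” filter are just examples):

<!-- a sketch: adjust the attributes and the filter to whatever actually needs to be migrated -->
<fetch>
  <entity name="contact">
    <attribute name="contactid" />
    <attribute name="firstname" />
    <attribute name="lastname" />
    <filter>
      <condition attribute="statecode" operator="eq" value="0" />
    </filter>
  </entity>
</fetch>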

2. For the import actions, we need an import data action

And we don’t want to update any existing records – we just want to create them and leave them be after that, so let’s also check the “Create Only” checkbox.

image

3. Build the package

4. And run it in the target environment

If you did not see some of those records in the target environment before:

image

You will see them right after:

image

Sure, there are some questions here, such as: what happens to non-existing references? In the example above, that contact might have some lookups to other data. What happens to the relationships? Etc.

Well, in short, if the referenced data is not in the target environment, those lookups just won’t be populated. As for the related entities and/or N:N relationships, they will have to be exported/imported through additional export/import actions, which is not a problem since you can package all of those into the same EZ Change package.

Automated Deployment: Adding a new field and migrating field data

Imagine this kind of (probably simplified) scenario:

  • You have added a field to the contact entity in the sandbox environment. Let’s say it’s “Saved Full Name” field
  • You have also created a workflow to copy values from the out-of-the-box “full name” field to your new custom field
  • You want to make sure that, once your updated solution is deployed in production, that new field is populated from the out-of-the-box “Full Name” field as part of the deployment. The tricky part is that you might not be the one doing the production deployment, and the deployment itself may happen a few months later, which means there will have to be some coordination.

This is where packaging your solution with EZ Change might be the way to go.

Here is how it works:

1. Create a new solution in Dynamics

Add the Contact entity, the new field, and the workflow to the new solution:

image

When setting up the workflow, make sure it’s configured as an “on demand” workflow:

image

2. Create a package using EZ Change

For the introduction to EZ Change, have a look at the previous post: http://www.itaintboring.com/dynamics-crm/automated-deployment-and-dynamics/

This package won’t need any special build actions, and there will be two run-time actions. First, you would need to deploy the solution. And then you would need to run the workflow. So here is what the package will look like:

image

The Deploy Solution action does not require any special parameters. It will deploy the solution and publish the customizations. The second action is where the data migration will be happening.

When defining that action, you’ll need to provide the workflow ID and FetchXml so the tool knows which workflow to run and which records to run it on:

image
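For example, the FetchXml might select only those contacts where the new field has not been populated yet (a sketch – ita_savedfullname is a placeholder schema name for the “Saved Full Name” field):

<!-- a sketch: ita_savedfullname is a placeholder schema name for the new field -->
<fetch>
  <entity name="contact">
    <attribute name="contactid" />
    <filter>
      <condition attribute="ita_savedfullname" operator="null" />
    </filter>
  </entity>
</fetch>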

3. Save the package and build it

Use File->Save to save the package

Use Package->Build to build the package

4. Add your package file name to the “orderedpackages.txt” file

This is how the tool will know which packages are supposed to be deployed when the time comes to deploy them

image

At this point everything will be ready for production deployment, and you will have two options to deploy in production:

5.a  Individual package deployment

You can start the tool, open the package, and run it against your environment using the Package->Run menu

5.b  Cumulative deployment

You can run the tool from the command line:

ItAintBoring.EZChange.exe <Path to the packages folder> <Environment Name>

In which case the tool will deploy all packages mentioned in the orderedpackages.txt file which have not been deployed in the target environment yet (in other words, it’ll be a cumulative deployment).

In either case, as a result of this deployment you will achieve two goals:

  • A new field will be added to the contact entity
  • That field will be pre-populated from the out-of-the-box “full name” field:

image