MS Certifications – Sharing my Mind Maps

Over the last 15 months, I have completed 5 Microsoft certifications and have used various sources on the internet to get me through.

My method of remembering facts relies on completing a mind map. This way, I note down the key points (key to me, anyway) for a subject and link them together. The map serves as a quick refresher on the day, but building it also forces me to fill out my knowledge as I research the content and develop my understanding.

Please don’t think that memorising these maps will be all you need to do to pass any of the exams. I share them to assist you in getting that certificate, and maybe to trigger something that you don’t understand so you can research that area more effectively. I also don’t guarantee that they are all accurate; the vagaries of time and my ineptitude will ensure they are not.

It goes without saying that practice is the best way of ensuring you have the right understanding of the product, but I have found that there are always areas you have not come across, even if you have been working on an application for years. Microsoft never stops developing.

I will keep this page up to date as I complete more exams with the mind map and any resources I used, with the exam I took most recently at the top.

All the mind maps can be found here. I use Freeplane as my mind mapping tool of choice, because it is open source, free for unlimited maps, allows tweaking of styles and so on, and is not trying to sell you its bigger brother. There are others out there which offer increased functionality, but this works for me. You can import mind map files into most other services.

MB-240 – Field Service

Mind Map

Field service was a stretch goal for me, as I have never used it in anger on a client site. It is a relatively new product in the suite, but one that is getting a lot of traction in the D365 community.

I went into this exam thinking I was not prepared, but managed a decent pass with my learning. The exam itself is quite short, but did stretch my understanding.

My first resource, as always, was Neil Parkhurst. He has a great series of posts on this exam’s predecessor, MB2-718. Whilst slightly dated, it is still relevant.

Further, I went through the Microsoft Learn topics on Field Service here.

MB-200 – Customer Engagement Core

As I have been working with D365 a while, I was very lazy going into this one. I just assumed I knew it all, brushed up on the new stuff and went for it. Thankfully, this didn’t bite me in the ass.

I didn’t do a map for this one, but I reviewed the map for MB2-716 here.

I also reviewed Neil’s posts on this one, and @NZ365Guy’s videos on the subject in the OpenEdx course, which are great, particularly for the new stuff in the application, like Flows.

AZ-203 – Developing Solutions for Azure

Mind Map

I consider myself a lapsed developer. I love coding, but my career has taken me more towards being an advisor and designer than a “coder”. This exam was proof to myself that I could still run with the cool kids, and it also exposed me to a lot of the Azure stack that I did not know.

As I was preparing for this exam back in January, to get the Azure Developer Associate certification, it swapped from 2 exams to 1, AZ-200 having been the first of the pair, which is why references to AZ-200 linger.

For this exam I was indebted to the Pluralsight course, which I was thankful to have access to.

From a D365 developer point of view, this was a tough one, a step above and beyond what I expected. There is so much content, and knowing each area to a decent level really taxed me. This was the first exam to push the time I had available to complete it, with a large number of questions and a lot of deliberation.

MB2-718 – D365 for Customer Service

Mind Map

This was the last of the older exams I took before Microsoft revamped the certification paths. Neil Parkhurst has an excellent blog on this one, which was my source. I have been doing service for quite a while, but the intricacies around SLAs and Queues were something that I had to learn. It also contained Unified Service Desk and Voice of the Customer, both subjects I had not come across.

MB2-716 – Customisation & Configuration

Mind Map

Another old exam, and something I take pride in, as it is my bread and butter. This was another recap of my understanding, particularly around the bits I would just Google when I came across them in my day job. Neil Parkhurst again provided the detail which saw me through. Bits on auditing, configuring email and so on were items where I knew the fundamentals, but Microsoft has a knack of slightly tweaking the wording to give a different answer, so it is vital you know your stuff.

MB2-715 – D365 Online Deployment

Mind Map

I am fortunate to have come to D365 after online became the chosen deployment method, so I don’t have the nightmares some older community members do around installing and configuring on-premise solutions. This is another exam that covers my day-to-day role, so it was just a matter of brushing up where I did not have enough knowledge.

Neil Parkhurst (who doesn’t owe Neil a beer?) has it covered again. There is a lot you take for granted here that you need to get to grips with, such as licensing, what you need to do in the Office Admin portal vs the D365 admin centre, email configuration and integration with other apps like SharePoint.


Cloning Flows: Location triggers for everyone

Sometimes ideas don’t work out. This is one of those times. But the reason I blog is to learn, expanding my knowledge of the Power Platform and of components outside it, so I figured I would blog about my failure; learning is learning. Then, as I started testing the flow again and moving environments, it started working. I guess this is down to the location trigger being a work in progress. Moral of the story: if it was broke last month, try again this month.

Back in July, I started working on this scenario, but couldn’t get it working. I noticed @Flow_Joe_ & @JonJLevesque did a video walkthrough of using the Geofence trigger to send out a summary of the customer when a salesperson enters an area, which reminded me of my failure, hence why I have written it up. While Joe & Jon’s video shows us how easy it is to create a flow, I think expecting salespeople in general to build it themselves is a step too far. You cannot expect a salesperson to have any interest in creating flows to do this; you can expect them to click a button on a form within their D365 application.

Objectives

  • The Scenario
  • Creating the Flow button
  • Cloning the Flow
  • Outcome

The Scenario

Numerous times when I have been a customer, a salesperson would come to us not knowing that we had several major cases logged with them against their product. This is mainly down to lazy salespeople (I know, they don’t exist), but it would be awesome for the salesperson to get a summary of the account when they get in the door of a customer: the number of support cases, a list of the open opportunities and orders, any complaints that have been logged. All of this information is available to the salesperson via the D365 mobile app, but it would be good to ensure that they get it and are less likely to get caught out by a customer venting at them about 5 critical bugs that have been sat around for a month.

The Solution

Flow has a new trigger, still in preview, Location, which fires via the Flow mobile application when a user enters or exits an area. This is perfect for our scenario: stick a geofence around a customer’s location and, when the user enters the area, it gets triggered. Look up the customer, format an email and send it to the user.

Flow is a user friendly, low code solution, but you cannot expect a salesperson to create a flow for each account they want this trigger for. What can be done is to put a button on a form that automatically creates a Flow for the user against the account they have selected, which would then be triggered when the user enters the location.

There are 2 separate series of flows required. The first starts with an action from the user on the account record, which triggers the cloning of a template.

The second series is the clone of the template, which triggers sending the salesperson the relevant information when they enter the customer’s property.

Creating a Flow Button

Starting with a CDS “When a record is selected” trigger, configure it to be used when an account is selected.

The next step is to retrieve who is running this flow. As mentioned, this button will be published on an Account form, so it is essential to know who is running it, so an email can be sent to them. The account information and the user are sent as the body to an HTTP POST trigger, which is the next flow in the chain.

An HTTP trigger is used because the next Flow requires enhanced access. An admin user needs to clone a Flow, which you would not want a normal user to be able to do. The admin is used as well to ensure any runs that happen are legitimate; the admin or system account shouldn’t belong to someone who could have Flow in their pocket.

To have a URL to send to, the next Flow needs to be created first, but for now, this is where the button appears within the D365 interface. The first time you run it, there are a few confirmations to work through; after that, you can run the flow.

Cloning the Flow

This flow clones an existing template, tweaks it slightly and gets it up and running as the user.

Starting with an HTTP Trigger, I use a sample payload to build the schema.
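For illustration, the payload is just the account and user handed over from the button flow. The property names here (Account, Email) are the ones the later replace steps reference, but treat the values as placeholders:

{
    "Account": "00000000-0000-0000-0000-000000000000",
    "Email": "salesperson@example.com"
}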

Next is retrieving the account. As the account id is passed in from the calling Flow, a simple Get Record is used.

Next, configure the name of the Flow that will be created, making it unique for the user by adding their email address in. A flow definition string is also initialised for later.

In this Flow, the user that called it from the button is needed, so it retrieves the profile using the Office 365 Users action.

Next, retrieve my template flow. Flow has several actions for the management of Flows, which are incredibly useful to a Flow administrator. The template flow is a simple flow which has a location trigger and an HTTP action to call a secondary flow. I will discuss the detail of this later.

The next couple of actions try to determine if a flow with the FlowName defined above already exists, firstly by getting a list of all my flows (as an admin), then getting a list of Flows in the organisation, then filtering the results by the flow name that was defined in the initial steps.
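If a Filter array action is used for that last step, the condition is a simple equality check. Assuming the management connector surfaces each flow’s display name under properties.displayName, it would look something like this:

@equals(item()?['properties']?['displayName'], variables('FlowName'))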

If there is a flow already, just stop. If not, carry on & clone the template flow.

The Template

The Log Template is a very simple flow: a small location trigger with an HTTP call action. The HTTP call passes in the user’s location, the account id and the user who started the process. Both the email and the account id will be swapped out as part of the clone.

The trigger region is essential for any location trigger. This one fires off the Microsoft campus in Redmond; someday I will be fortunate enough to go to the motherland. I chose this as it is not likely that the user would have it as a client, but it doesn’t really matter where you choose, as what you need is the latitude and longitude so you can replace them when you clone the flow.

If you click on the peek code button against the trigger, it shows a JSON representation of the trigger. The latitude and longitude are those of the Microsoft office, and this is the bit I need to replace.

Cloning the Flow (part 2)

A Flow is, at heart, just a JSON file. Obviously, how it is rendered and how the hooks and actions work are the power, but the definition is a JSON file. Using this knowledge, we can create a new version of the template, with a location specific to the account.

The template in all its glory is below. Using simple find / replace, we tweak it to the specific location, account and user.

{
  "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "$authentication": {
      "defaultValue": {},
      "type": "SecureObject"
    }
  },
  "triggers": {
    "manual": {
      "type": "Request",
      "kind": "Geofence",
      "inputs": {
        "parameters": {
          "serializedGeofence": {
            "type": "Circle",
            "latitude": 47.64343469631714,
            "longitude": -122.14205669389771,
            "radius": 35
          }
        }
      }
    }
  },
  "actions": {
    "HTTP": {
      "runAfter": {
        "Initialize_Email_Variable": [
          "Succeeded"
        ]
      },
      "type": "Http",
      "inputs": {
        "method": "POST",
        "uri": "https://prod-68.westeurope.logic.azure.com:443/workflows/<GUID&gt;/triggers/manual/paths/invoke?api-version=2016-06-01&amp;sp=%2Ftriggers%2Fmanual%2Frun&amp;sv=1.0&amp;sig=<SIG&gt;-JQQvYT0",
        "body": {
          "lat": "@{triggerBody()?['currentLatitude']}",
          "long": "@{triggerBody()?['currentLongitude']}",
          "user": "@{variables('Email')}",
          "account": "@{variables('accountId')}"
        }
      }
    },
    "Initialize_Account_Variable": {
      "runAfter": {},
      "type": "InitializeVariable",
      "inputs": {
        "variables": [
          {
            "name": "accountId",
            "type": "String",
            "value": "<accountId&gt;"
          }
        ]
      }
    },
    "Initialize_Email_Variable": {
      "runAfter": {
        "Initialize_Account_Variable": [
          "Succeeded"
        ]
      },
      "type": "InitializeVariable",
      "inputs": {
        "variables": [
          {
            "name": "Email",
            "type": "String",
            "value": "<email&gt;"
          }
        ]
      }
    }
  },
  "outputs": {}
}

Back on the clone flow, the next step is to convert the template to a string. This makes it easier to replace the latitude, longitude etc. with the ones we want.
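The conversion itself is a one-liner. Assuming the management action that retrieved the template is called Get_Flow and exposes the definition under properties.definition, the expression is roughly:

string(body('Get_Flow')?['properties']?['definition'])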

The OOTB account record has latitude and longitude fields. This data is not normally populated, but it is used by Field Service and other applications. I used Field Service to populate it using the Geo Code button.

As you can see from the above, Field Service populates both latitude and longitude to 5 decimal places. This is common precision in any mapping software such as Google, so I am not sure why, if you do the same via the Flow trigger, you get precision to 15 decimal places for latitude and 17 for longitude.

The next 2 steps exist because of my attempts to get the flow to work. One of my thoughts was that the flow was expecting all 15 of the decimal places to be populated, so these steps pad out the number held against the account with additional digits.

The expression is the same for both:

concat(string(body('Get_Account')?['address1_latitude']),'111111')

The next step replaces the newly calculated values for latitude and longitude in the JSON definition:

replace(replace( variables('flowdefstring'),'47.643434696317136',outputs('Replace_Lat')),'-122.14205669389771',outputs('Replace_Long'))

The account id is also replaced. This is used in the cloned flow to define which account the user selected, as the trigger only gives you the user’s current location, not the centre of the circle you configured. You could use those values to find the account, with difficulty, unless there is something I am missing; I prefer to add a variable in the clone which holds the account id.

replace(outputs('Replace_Lat_Long'),'<accountId&gt;',triggerBody()?['Account'])

The same goes for the email to send to. It should be the user who triggers the geofence, but it seems to be the admin: as I clone the Flow with an admin account and then add the user as an owner, it runs under the admin account.

There is enough info now to create this flow. Using the Create Flow action, the new flow is created and up and running.

I use the json() expression to convert the string I have been using for the find / replace of the latitude, longitude etc. back to JSON, ensuring the Flow is created with a proper JSON definition:

json(variables('flowdefstring'))

The final step is to add a Flow owner. As the salesperson who triggered the process is the one it should trigger for, make them the owner, so that it runs under their context.

Outcome V1

Ignore this bit if you want to avoid the author moaning about stuff that doesn’t work.

If I run the whole flow, I do generate a new Flow.

Going into what was generated, using peek code again, you can see that the Microsoft location has been replaced with the Avanade office.

The trigger is active, but this is where it stops. I cannot get this trigger to fire. Changing the location to my home, going for a walk and coming back doesn’t trigger it.

If I don’t put in the padding for the latitude and longitude, it doesn’t trigger.

If I clone from my location, not changing the latitude and longitude, still the trigger doesn’t fire.

If I configure a new trigger from scratch, that works.

Everything about the trigger looks the same when you view it in code view, but there must be something different.

This is where I started reaching out. I tweeted the gods of Flow about it and asked in the Flow forum, where I did get a response, basically saying the same, and noting that the location trigger is in preview.

So, if you have got this far, how do I fix it?

Outcome V2

Like I said at the outset, this didn’t work for me. Frustration set in, and I shelved the idea. But, as I was putting this blog post together and re-deploying the components because my demo system had expired, it worked!

So, moving on, we need to send an email to the user with the playbook for the account. I want to list the last 5 critical cases, the last 5 open opportunities, the last 5 notes and any description the user has put in.

It is triggered by an HTTP request, with the schema defined by a sample payload, containing who triggered the workflow and which account.

Then, a great time for a parallel branch. The Flow retrieves the cases, notes and opportunities in parallel branches.

Each branch does a similar thing. Looking at the Notes branch: firstly, retrieve the records with a CDS List Records action, using an OData filter and order by, returning the top 5 only.
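As a sketch, the List Records action for the Notes branch would be configured along these lines, assuming notes are linked to the account via the standard _objectid_value regarding lookup and the account id comes from the trigger payload:

Filter Query: _objectid_value eq @{triggerBody()?['account']}
Order By: createdon desc
Top Count: 5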

Next, put this in an HTML table, selecting the output from the Get Notes action. I select the Advanced option, then Custom columns; this way I can define the order and which columns I want to display.

The final step is to send an email.

Obviously, this can be customised to your business need, but my example lists the cases, opportunities & notes, and reminds them to fill in a contact report.

Summary

So, the user selects a button on an account form, which allows them to receive updates about one of their customers when they enter the location of the account. Easy.

I tested this with my home address and with a different user, and you can see that I get the email through. Veronica is in the US; I wasn’t up at 1am writing blogs & fixing Flows.

You can also see that Flow notifies the user that it has made them an administrator on a Flow.

This Flow starts with a Flow button on a record, making it a user-initiated process. It could instead be triggered off a record creation process: if the user follows an Account, create this automation for them, as long as they have opted in.

There is location tracking in the Field Service application, but that requires the Field Service mobile app and is not suited to a salesperson. They just need to install the Flow app on their device and forget it is there.

AI Builder – Text AI

My blogging journey started with using LUIS, one of Microsoft’s Cognitive Services, to automate case assignment. This blog goes into detail about how it all hung together, using a model defined in LUIS, calling the LUIS endpoint when a new case is created and classifying the case, by the subject, with the result from the call.

After my summer break (sorry, but family etc comes first) I thought I would revisit this scenario, but using one of Microsoft’s shiny new AI Builder capabilities, the Text Classification AI Model.

Objectives

  • The Scenario
  • Training your Model
  • Getting Data
  • Publishing the Model
  • Using the Tags

The Scenario

In my first blog, I went through the scenario, so I don’t want to repeat myself, but for the lazy who don’t want to click through…

Big Energy is a supplier of energy products to end users. They have a call centre which handles any query from the customer. As a perceived leader in the sector, it is always willing to use the latest technology to allow users to interact with them, which reduces the pressure on the customer support centre.

Big Energy has a mailbox configured to accept customer emails about anything and, rather than have a group of 1st line support employees filtering and categorising the emails based on the content, wants to use cognitive services to improve the process of getting the email (the generated case) to the right team.

Using AI to file the case

LUIS does a great job of this, with a BA providing sample utterances for the model and training it.

Text Classification AI Model does it slightly differently. The model expects users to provide data (in the CDS) in the form of text blocks and representative tags for the data. Both need to be in the same entity in CDS.

On a standard Case record, the classification or tag is the subject field. Subject is a parent record of Case, and the tag would be the name of the subject. As subject and case are separate entities, the Text Classification AI model will not work out of the box. A field, even a calculated one, has to be introduced to enable the classification AI to work. Adding data to an entity from a parent entity breaks my Third Normal Form training (anyone remember that? Is it still a thing?).

I have raised this issue as a new idea on the PowerApps ideas forum; go there and give it a vote!

The new logic for our AI model is that the AI will classify the incoming case, adding a tag. That will trigger a flow which changes the subject of the linked case accordingly, which in turn triggers re-routing of the case as it did in the original LUIS method.

Training your AI

Any AI model needs training; the AI needs to separate the wheat from the chaff. Creating a model is simple in PowerApps.

Start at make.powerapps.com and select AI Builder, then Build.

There are 4 options here:

Binary Classification is useful to give a yes / no decision on whether data meets certain criteria. The criteria can be up to 55 fields on the same entity. For example, is a lead with a low current credit limit, a high current account value, no kids but a pink toe nail (shout out to Mark Christie) likely to get approved for a new loan?

Form processing is intended to assist users in automating scanned documents to prevent re-keying. An example would be any forms handwritten as part of a sales or service process (before you convert to a PowerApp, obviously).

Object detection assists in classification of items, be it types of drink, crisps or bikes, etc.

Text classification decides on a tag for a block of text. For example, a user could enter a review of a product online, and text classification could work out which product it was for or whether the review is positive.

All 4 of these have origins in the Cognitive Services provided by Azure, LUIS being the big brother of Text Classification.

Ensure you are in the correct environment. Text Classification only works on data within your CDS environment, so don’t expect to reach out to your on-premise SQL server. There are ways to bring data into CDS, but they are not in scope for this discussion.

Selecting Text Classification displays a form to give you more understanding, and it is here that you name your model.

Hit Create and then Select Text. This will list all your entities in your CDS environment (in my case, a D365 demo environment).

Select the entity you want, Case for our PoC.

The interface will then list all the fields suitable for the AI model, namely anything that is a text field. I chose the description field, which typically holds the email that the user sent when raising a case with the support department.

Hit the Select Field button and it will present you with a preview of the data in that table.

The next screen is where you select your tags. The tags need to be in the same table which, as already discussed, is a bit of a limitation of AI Builder. Less normalised data is more common in canvas apps or SharePoint linked apps, but for structured data environments with relationships and normalised data this is a limitation that will hopefully be removed as Text Classification matures.

Also, option sets are not available, another common categorisation technique. Multi-select option sets would be an ideal tagging method too. Assume that this will come in time.

For my PoC, I created a new field, put it on the Case form and started filling it in for a few records.

Select the separator. If your tag field contains multiple tags, separated by a comma or semi-colon, this is where you configure it.

It also gives you a preview of the tags the AI builder would find using your chosen method. You can see that, with the No separator option, “printer; 3d” is one tag, rather than the assumed 2 tags displayed if semi-colon is selected. This depends on your data.

The next page displays a review of your data and the tags that the AI builder finds.

Next, select a language for the text field dependent on your data.

Once selected, train your model. This is where I started to run into problems. My initial population of tags was not enough, and the training came back quickly with an error. There should be a minimum of 10 texts per tag, which I didn’t have. That would be hundreds of rows. How was I going to automate creating data to give the Text AI enough to be a suitable demo?

Getting Data

I needed thousands of records to train my model properly, thousands of records relevant to the tags I created. No online data creator seemed suitable, as none were specific enough, so how? A flow.

First, I created a column in the Contact table to store a number for my contact, a unique number so I can randomise the selection of a contact.

Next, I needed some data for the case description and the tags. This was already done, as it is the same as the utterances and intents I used for LUIS, so I exported the LUIS configuration, put the data in an Excel file and added a number to that too.

Ready for the Flow

My simple flow is described below.

Ask for the number of cases to create, then keep creating cases until you have reached that limit, using a random contact and a random description.

This flow is triggered manually, so I start with a manual trigger and also prompt for the number of cases to create.

The Subject variable is used later to define the reference for the subject we have chosen.

The default limit for loops is 60 iterations. I realised late in the day that you can change that, but breaking up loops is good practice, to limit the scope of what could go wrong, so I created a loop within a loop structure for my flow.

I restrict the inner loop to 50 iterations maximum, which means the number of times the outer loop runs has to be calculated. If I want 920 cases created, my outer loop would run 18 times, each pass creating 50 cases, and a final set of 20 covers the rest.
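As a sketch, assuming the requested total arrives from the manual trigger as a number input (surfacing as triggerBody()?['number']), the outer loop count and the final remainder are simple div and mod expressions:

div(triggerBody()?['number'], 50)
mod(triggerBody()?['number'], 50)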

The next steps initialise some counters used in the loops. I also want to ensure that, if the user wants to create fewer than 50 records, the outer loop doesn’t run at all.

The outer loop runs for the number of loops I have calculated; this is the loop condition, and the counter increments as the last thing in the outer loop. The first thing in the outer loop is to reset the case counter, the counter for 0-50. If we are in this inner loop at all, at least 50 cases will be created.

The first thing it does is get a random contact, using an OData filter on the number field created earlier and a random number from 0-875 (875 being the highest number in that table).
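Assuming the new number column is called cc_contactnumber (a hypothetical name), the OData filter would look something like the below. Note that rand’s upper bound is exclusive, hence 876:

cc_contactnumber eq @{rand(0, 876)}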

Once the contact is retrieved, find a random description / tag combination. The data from the LUIS utterances is held in an Excel file on a Teams site. Again, a rand() expression picks a random number up to the maximum row number in that file.

Because more than one subject row could be returned, and because I don’t like nesting apply to each loops inside each other, I set the Subject Id variable here.

Ready to create a case now.

Nothing special. It also populates the tag field.

After some testing, to ensure that the case has the necessary fields entered, the flow was run for a thousand records without an issue.

Creating data this way isn’t quick, 20 mins for 1000 records, but it is easy and lets you bring in realistic reference data with little effort. Superb for PoC environments.

Training your Model (with data)

Once the data was generated, it was time to re-train my model. It ran through successfully this time.

The model is 97% sure that, whatever I throw at it, it will be able to match it against the tags. There is a quick test option here too, which allows entry of a sample phrase to check your model.

All ready to publish.

Publishing your Model

Publishing the model allows it to be used within Flow and PowerApps.

Clicking Publish generates a child table of the entity you first chose, where the predictions are stored. The documentation states the table will be called TC_{model_name}, but mine was created with gobbledegook.

The link on the form helpfully allows you to go straight to the entity in the new customisation interface, where you can change the label of the entity.

Also, it is useful to change some of the views, particularly the associated results view. By default it includes name & date, which is pretty useless, so add the tag and the probability.

As this is a child table of Case, it is by default visible in the case form Related navigation item. In the classic customisation interface, you can change the label of this view.

As it is published, users can use Flow and the Predict action to predict the tag for a given section of text, useful if you want to do stuff with the text before it reaches an environment.

Now that it is published, you need to allow the model to run. This means it runs every time there is a change to the text field. This is all done via Flow, so it will use your flow runs. It stores the result in the new entity.

If a case is created now, it automatically creates the tag secondary record.

Using the tags

As AI Builder generates a record for you with its prediction, and the data is in CDS, it is a simple Flow to utilise it: when a record is created in the AI tags table, update the corresponding case to change the subject accordingly.

A simple trigger fires when a record is created. The first action is to find the subject from the tag.
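As a sketch, if the tag text lands in a field called cc_tag (hypothetical) on the new record, a List Records action against Subject can filter on its primary name field, title:

title eq '@{triggerBody()?['cc_tag']}'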

Update the case record with the subject and the tag so the AI can be retrained later.

That’s it. Replacing LUIS with a more user friendly environment is definitely a tick in the box for Microsoft. The AI in PowerApps feels like a simple, user friendly stepping stone into the AI world for a lot of businesses. Hopefully, businesses will embrace these simple models to shortcut processes, improving employee and customer experiences.

User Admin – Published App

After being asked on LinkedIn to publish both of the apps that I built for the User Security Admin walk-through, I have done so on the Dynamics Communities Power Platform Bank.

Stand-Alone Security / User App

The first one, which if you remember is a stand-alone application detailed here, can be downloaded here:

https://dynamics365society.uk/powerappsbanklist/dynamics-ce-security-user-profile-powerapp/

Embedded Security App

The second app is the one where I converted the original to an embedded form on the User record, detailed here:

https://dynamics365society.uk/powerappsbanklist/dynamics-ce-embedded-security-powerapp/

Both apps require the custom connector to read and update teams and roles, which is included in the package.

Please let me know via Twitter or LinkedIn if it works for you.

Thanks go to Those Dynamics Guys for the great Dynamics community & the PPB, as well as Joergen Schladot for giving me the kick up the arse to get it done.

User Admin PowerApp (Part 4)

As putting a canvas app on a model driven form is now out of preview, I thought that my PowerApp to add security roles and teams might be a suitable candidate to be migrated to an embedded canvas app.

Not going to repeat

There are lots of videos and blog posts out there which detail how to embed a canvas app into a model driven form, so there is no point repeating that; just stand on their shoulders.

Needless to say, there are a few gotchas. I will walk through how I addressed each one.

The connection isn’t there straight away

Rather than displaying the first user selection screen, the embedded app should go straight to the display of user roles and teams, with the selected user coming from the record the end user is on. In the original code, I used the selected record, which was available because the user had clicked on it.

From the embedded app, there is a connection to the underlying record, using the ModelDrivenFormIntegration.Data property. I found that this was not populated straight away, not before the form was visible anyway. To get around this (and I think Scott Durow demoed this in a D365 User Group meeting) you need a wait screen.

The Timer is configured with a duration of 2000ms and Auto start enabled. The OnTimerEnd event is a simple navigation to the actual start screen.
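A minimal sketch of that OnTimerEnd formula, assuming the main screen is called MainScreen:

Navigate(MainScreen, ScreenTransition.None)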

I found that this was more than enough time to get the data available in my application, so that the OnVisible on the main screen could do all the things it needed to do.

Don’t forget to publish

The only way you can really see what is going on in your app is to run it within the D365 form. Debugging the application is a little contrived when using the embedded app, so I used a liberal number of labels with string values to understand what is being set and which value is being used.

Fundamentally, you need to publish every time you want to see the latest application in your environment. Without the publish, the previous version is shown.

Screen size woes

This app is now configured with landscape mode, with Scale to fit, locked aspect ratio and locked orientation. This seems to work more effectively for higher resolution screens on Windows machines. The screen scales with its size, which is great, but I would like to see the option for better use of the real estate.

As you can see, when the screen is large, the buttons and text in the PowerApp get very big, over the top for mouse users.

If you change the screen properties to remove Scale to fit, and alter some of the properties to ensure proper positioning as the screen resizes, you get a more appropriate display for large screens, but are limited on tablets etc. Know your audience, I guess.

Back to the code

The first part is to create a variable with the user data in it. ModelDrivenFormIntegration.Data will always be a table, albeit with only one row in our case. Using a variable, this single record is available elsewhere; it is just a way of not having to repeat that long property line throughout the application.

Set(
    userdata,
    First(ModelDrivenFormIntegration.Data)
);

The next step was a copy and paste from the other version, replacing the places where the app expected a selection of a user with the user id taken from userdata. Firstly, create a FetchXML string to derive the teams for the user, then pass this into the flow connector to retrieve the teams.

   Set(
        teamstring,
        "<fetch top=""50"" >
  <entity name=""team"" >
    <attribute name=""name"" />
    <filter>
      <condition attribute=""isdefault"" operator=""eq"" value=""0"" />
      <condition attribute=""systemmanaged"" operator=""eq"" value=""0"" />
    </filter>
    <link-entity name=""teammembership"" from=""teamid"" to=""teamid"" intersect=""true"" >
      <filter type=""and"" >
        <condition attribute=""systemuserid"" operator=""eq"" value=""" & userdata.SystemUserId & """ />
      </filter>
    </link-entity>
  </entity>
</fetch>"
    );
    // Stick the result in a collection for display on the form
    ClearCollect(
        teams,
        D365FlowConnector.GetTeams(teamstring).value
    );

The format of the application has changed to a landscape view. I made both the teams and roles grids visible, with a search field at the top. The Teams grid’s Items property is set to the collection, with filtering and sorting applied.

Next, in the OnVisible event, the FetchXML is built to help in adding teams to the user. Firstly, create a table with snippets for the existing teams filters, then create the actual FetchXML.

    Set(
        teamTable,
        ForAll(
            teams,
            "<condition attribute=""teamid"" operator=""neq"" value=""" & teamid & """ />"
        )
    );

    // create the fetchXML to restrict the teams to those that the user has not already got
    Set(
        teamsNotGot,
        "<fetch top=""50"" >
            <entity name=""team"" >
                <attribute name=""name"" />
                <filter>
                    <condition attribute=""isdefault"" operator=""eq"" value=""0"" />
                    <condition attribute=""systemmanaged"" operator=""eq"" value=""0"" />
                </filter>
                <filter type=""and"" >" & Concat(
            teamTable,
            Value
        ) & "</filter>
            </entity>
        </fetch>"
    );

The process above is also repeated for the roles. The one gotcha around roles is that there is a business unit id field on the user, but that gives you the name of the BU; the BusinessUnitIdName field is what actually holds the GUID of the BU. Seems counterintuitive to me.

    Set(
        rolesNotGot,
        "<fetch top=""50"" >
                        <entity name=""role"" >
                            <attribute name=""name"" />
                            <attribute name=""roleid"" />
                            <filter type=""and"" >" & Concat(
            rolesTable,
            Value
        ) & "</filter>
                            <filter>
                            <condition attribute=""businessunitid"" operator=""eq"" value='" & userdata.BusinessUnitIdName & "'/></filter>
                        </entity>
                    </fetch>"
    )

The final app, I think, is a much better interface to manage roles and teams for a user. It works well in this context and allows managers to quickly change the roles for those they are concerned with.

Adaptive Cards – Improved Approvals (Part 2)

Continuing the walkthrough of creating a more effective adaptive card for approvals, this part describes the flow I created to generate the card, as well as complete the action in D365 depending on the response.

Objectives

  • The Scenario (Part 1)
  • Preventing progress of an Opportunity (Part 1)
  • Using Flow to create a basic Approval (Part 1)
  • Creating an Adaptive Card (Part 1)
  • Using Flow to create the Approval (This Part)
  • Updating the Opportunity (This Part)

Starting out

As previously described, the Flow is triggered when a user updates the Develop Proposal checkbox. In the first stages, the flow also retrieves some records that are needed later on to populate the card. There are also initialisations of 2 arrays that are used to populate the approvers and product lines on the card.

The next section is used to retrieve the approvers for the territory. In part 1, a many-to-many relationship was added, linking User to Territory via the territory approvers table.

As the territory approvers table is a many-to-many relationship, it does not appear as a standard table in the Common Data Service connector, nor in the D365 connector. There are various blog posts out there which state you can just use a custom value naming the table, but I couldn’t get it working, so I fell back to my custom connector.

In my previous post on security roles via a PowerApp, the custom connector, which allows a FetchXML string to be executed against an entity, is used a lot to get the teams and roles for a user. This connector is again used to find the users associated with a territory via the new relationship. The FetchXML is below.

<fetch top='50' >
  <entity name='systemuser' >
    <attribute name='internalemailaddress' />
    <attribute name='fullname' />
    <link-entity name='cc_territory_approver' from='systemuserid' to='systemuserid' intersect='true' >
      <filter>
        <condition attribute='territoryid' operator='eq' value='@{body('Get_Account_Manager')?['_territoryid_value']}' />
      </filter>
    </link-entity>
  </entity>
</fetch>

This will return JSON which corresponds to the users linked as approvers to the territory.

[
  {
    "@odata.etag": "W/\"3421832\"",
    "internalemailaddress": "veronicaq@CRM568082.OnMicrosoft.com",
    "fullname": "Veronica Quek",
    "systemuserid": "824da0b2-6c88-e911-a83e-000d3a323d10",
    "ownerid": "824da0b2-6c88-e911-a83e-000d3a323d10"
  },
  {
    "@odata.etag": "W/\"1742271\"",
    "internalemailaddress": "danj@CRM568082.OnMicrosoft.com",
    "fullname": "Dan Jump",
    "systemuserid": "e3b305bf-6c88-e911-a83e-000d3a323d10",
    "ownerid": "e3b305bf-6c88-e911-a83e-000d3a323d10"
  },
  {
    "@odata.etag": "W/\"3422353\"",
    "internalemailaddress": "CarlC@CRM568082.onmicrosoft.com",
    "fullname": "Carl Cookson",
    "systemuserid": "113f1e3a-db90-e911-a822-000d3a34e879",
    "ownerid": "113f1e3a-db90-e911-a822-000d3a34e879"
  }
]

An approval needs a list of email addresses separated by a semi-colon. To achieve this, firstly put each of the returned email addresses in an array, then use a join to create the string used for the approvers.
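Assuming the array variable is called ApproverEmails, the join is a one-liner:

join(variables('ApproverEmails'), ';')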

Populating the Main approval

The next part is the body of the approval that is going to be sent. I’ll link the full version at the end of the article, but effectively, you copy your design in, remembering to insert appropriate dynamic content on the way.

Here, I create the 2 URLs that are displayed in the card, which combine the base URL with the Account or Opportunity id appended.

This is displayed at the top of the card.
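As a sketch, with a placeholder org URL, and assuming the CDS trigger’s body surfaces the record id as opportunityid, the Opportunity link is built something like this (the etn and id parameters match the URL format visible in the card JSON later):

concat('https://yourorg.crm.dynamics.com/main.aspx?pagetype=entityrecord&etn=opportunity&id=', triggerBody()?['opportunityid'])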

Further, formatting currencies is difficult in Flow (I stand to be corrected). I found this post on the Power Platform community which highlights the issue, and degvalentine has the solution, which I have tweaked to take account of null values in the fields in D365. This example is for one of the fields on the secondary grid.

if(empty(string(items('Add_to_Prod_LInes')?['manualdiscountamount'])), '0',
concat(
  if(
    greaterOrEquals(
      items('Add_to_Prod_LInes')?['manualdiscountamount'],
      1000
    ),
    concat(
      substring(
        string(items('Add_to_Prod_LInes')?['manualdiscountamount']),
        0,
        max(0, sub(length(first(split(string(items('Add_to_Prod_LInes')?['manualdiscountamount']), '.'))), 3))
      ),
      ',',
      substring(
        first(split(string(items('Add_to_Prod_LInes')?['manualdiscountamount']), '.')),
        max(0, sub(length(first(split(string(items('Add_to_Prod_LInes')?['manualdiscountamount']), '.'))), 3)),
        min(3, length(first(split(string(items('Add_to_Prod_LInes')?['manualdiscountamount']), '.'))))
      )
    ),
    first(split(string(items('Add_to_Prod_LInes')?['manualdiscountamount']), '.'))
  ),
  '.',
  if(
    contains(string(items('Add_to_Prod_LInes')?['manualdiscountamount']), '.'),
    concat(
      last(split(string(items('Add_to_Prod_LInes')?['manualdiscountamount']), '.')),
      if(
        less(length(last(split(string(items('Add_to_Prod_LInes')?['manualdiscountamount']), '.'))), 2),
        '0',
        ''
      )
    ),
    '00'
  )
)
)

Populating Product details

As the approval body is built up, the next stage is to create a table with the product lines in it. Getting the lines is a simple filter query using the primary key of the Opportunity.

As with the approvers, an array is populated with a formatted version of each line, taking the fields returned and combining them with formatting rules.

The first expression deals with the fact that one of the products chosen for the demo had a double quote (“) in it, which messes up JSON if it isn’t escaped, as it is the string delimiter. I used a simple replace expression to add a “\” before it.

replace(items('Add_to_Prod_LInes')?['productname'], '"','\"')

The next expression is the one above to format the currency with the appropriate commas and decimal places.

The output of this looping over the product lines is then combined using a join again, then combined with the main body string.

The bottom of this string starts the list of actions, which are the buttons.

The next step is to create the Approval. This is pretty simple, using a first to respond type and fleshing it out a bit, so that if a user uses the standard Flow Approval interface, they have something to relate to. No notification is needed (that would send an email to the approver), as the Flow will alert the approver via Teams.

My original design for this PoC was to push this notification / approval to a Teams channel, one notice to the Approvers channel. As Teams integrates with D365, it did not seem much of a hop to highlight the Opportunity approval there.

The only issue is that approvals don’t work in Teams channels, only when sent to a user. Until this is resolved by MS, you are limited to sending the approval to an individual in Teams.

Sending the Approval

The key bits of this action are ensuring you have the approver’s tenant (can you post approvals across tenants?), the respond link, the approval id and the creation time populated with the data coming from the approval. The same goes for the Reject action.

That’s it, the new approval is sent. The full JSON I have produced is here.

Waiting for the Approval

As the approval is configured so that anyone can approve or reject, the next action is to wait for that approval to happen. Approvals can take up to 30 days, which is another issue, but as this is to speed up the approval process, let’s not worry about that.

If the outcome is approved, then the field Complete Internal Review is checked and a note is created, linked to the Opportunity, logging who approved it.

This is in a loop, as, in theory, there could be more than one approver on an approval, if you use the Approval setting that forces everyone to approve something.

The Regarding / Regarding type, highlighted above, need to be populated, as otherwise you get orphan records and can spend 20 minutes wondering what is wrong (not me, obviously).

On the Reject side of the condition, the Opportunity is put back to the state it was in before the flow started, namely Develop Proposal is reset. This triggers our Flow again but, as the first condition is no longer met, it won’t go any further. A note is also added, to highlight who rejected it and why.

Adaptive Cards – Improved Approvals (Part 1)

Adaptive cards are relatively new to the stack of tools available to Power Platform users, emerging from Message Cards. They are a great way of interacting with users who are not typical D365 users, those on the periphery who are interested in the data but not the detail.

Objectives

  • The Scenario (This Part)
  • Preventing progress of an Opportunity (This Part)
  • Using Flow to create a basic Approval (This Part)
  • Creating an Adaptive Card (This Part)
  • Using Flow to create the Approval
  • Updating the Opportunity

The Scenario

Big Energy is going well. They are now involved in some big deals for big enterprises which need a lot of time to land. The proposals that are generated are complicated, and they have struggled with some dubious salespeople reducing the margins just to get the deals, which is just bad for business.

An approval process needs to be implemented, where one or more of a designated group of individuals per territory review the opportunity and decide if the margins are appropriate.

Unfortunately, the approvers tend to be very busy senior directors, who use D365 sporadically, if at all, and Big Energy needs to allow them to approve the opportunities wherever they are, using Outlook or Teams as the preferred option.

Tweaking the standard Sales process

Microsoft provides a Business Process Flow for Opportunity management, and in our scenario, only the approvers should be able to check the boolean Complete Internal Review. This is part of the standard Propose stage of the BPF.

 

To “lock” (I know it isn’t foolproof, what is?) the progress on Propose, the Complete Internal Review field is subject to a simple business rule: whatever stage the opportunity is at, lock the field.

Now no one can edit that field, and if that field is made mandatory to progress the BPF stage, no one can progress past Propose.

Territories are often used in Sales to group accounts or account managers, and in our scenario, there is a set list of approvers for a territory. I have added a new many-to-many relationship for this, Approvers, and ensured it is listed on the user form as one of the relationships.

Using Flow to create an Approval

In the standard Propose stage, there is another boolean that is of interest, Develop Proposal. The Flow is triggered when this value is changed; a simple CDS update trigger is the starting point.

The next stage is to confirm that this trigger is coming from the correct record state: the record has been marked with Develop Proposal, but the other field, Complete Internal Review, is still empty.

The flow to create the adaptive card is fairly intense, well, from my experience, as you will see, so for now, create an Approval using enough details to get the default experience that can be built on.

In Details, there is a lot you can do using markdown, but this is not as comprehensive as the formatting you get from adaptive cards.

When this flow is run, you will get an email to the assignee with a simple, standard approval, which is in itself an adaptive card, but it is fairly plain.

Using the Flow history, this action also shows the adaptive card that was built.

Copying this value into the Adaptive Card designer’s JSON section gives the format for a basic design, which can be augmented to show some proper information.

Building an Adaptive card

Adaptive Cards are a means to interact with your users via email, Teams or any other app that handles the rendering of them. They have actions, allow images to be presented and can format text in a markup that imitates a comprehensive website. They are supported in the Outlook mobile apps as well as O365, either using the main app or online.

They work by rendering a JSON object, which can be formatted to match the host application (the dark black Teams theme, for example, renders it very differently, but the core actions are still there).

Microsoft has built a superb tool for designing Adaptive Cards; the new version is at adaptivecards.io/designer. This site has lots of examples to get you started. The Expense Report is a good starting point from a design point of view, but the standard approval card forms the base for our card; there are bits in it that you need to incorporate into your card to allow the approval to work.

The parts in the data section are the essential bits that, in our adopted JSON, need to be duplicated or populated by Flow to allow our card to act as an approval.

My card is a bit different from the standard one, displaying key parts of the Opportunity and the associated product lines.

As you can see, there is a lot more information on what is happening on the opportunity, probably enough for a sales manager to make a decision in most cases. Included in the card are links to the Account and Opportunity if further review is needed.

I would recommend starting from a sample and building your content, with dummy data, so you can get the layout correct.

Each of the buttons is also a card in its own right, allowing a comment to be made before the approval is approved or rejected.

These have been copied from the standard adaptive card produced by the Flow approval so that the submitted approval works like a standard approval.

Some considerations and limitations

I first started by trying to reproduce the Expense Approval card in full from the samples.

This has a great use of hidden / visible sections for the expense lines, which could give you a lot of real estate for Opportunity lines. Unfortunately, these are not rendered in Teams.

Also, I thought I would be able to use an HTTP action, but again, any button with an HTTP action is ignored in Teams; you are only allowed to create actions for opening URLs, submitting, hiding parts and showing a secondary card.

Below the main part of the designer is the JSON, which is created by any changes you make above, but can also be edited directly and reflected in the visualiser. The snippet below is taken from the standard card and contains all the bits that need duplicating to ensure the new, improved approval works correctly.

   "actions": [
        {
            "type": "Action.ShowCard",
            "title": "Approve",
            "card": {
                "type": "AdaptiveCard",
                "body": [
                    {
                        "type": "TextBlock",
                        "text": "Comments",
                        "wrap": true
                    },
                    {
                        "type": "Input.Text",
                        "id": "comments",
                        "placeholder": "Enter comments",
                        "maxLength": 1000,
                        "isMultiline": true
                    }
                ],
                "actions": [
                    {
                        "type": "Action.Submit",
                        "title": "Submit",
                        "data": {
                            "Environment": "Default-2821cf92-86ad-4c7b-ba9a-5c79a70d4a21",
                            "ApprovalTitle": "Appoval required for Opportunity",
                            "ApprovalLink": "https://flow.microsoft.com/manage/environments/Default-2821cf92-86ad-4c7b-ba9a-5c79a70d4a21/approvals/received/6cce94f6-603c-40e7-adb6-8b20c75f724f",
                            "ApprovalName": "6cce94f6-603c-40e7-adb6-8b20c75f724f",
                            "ItemLink": "https://.crm.dynamics.com/main.aspx?newWindow=true&amp;pagetype=entityrecord&amp;etn=opportunity&amp;id=b7c47c42-a290-e611-80e3-c4346bacba3c",
                            "ItemLinkDescription": "Opportunity for  7-Eleven and Udaside label - ",
                            "OnBehalfOfNotice": "Requested by Carl Cookson <CarlC@CRM.onmicrosoft.com&gt;",
                            "CreatorName": "Carl Cookson",
                            "CreatorEmail": "CarlC@CRM.onmicrosoft.com",
                            "CreationTime": "\"2019-07-03T14:30:02Z\"",
                            "MessageTitle": "Appoval required for Opportunity",
                            "Options": [
                                "Approve",
                                "Reject"
                            ],
                            "SelectedOption": "Approve",
                            "ActionType": 1
                        }
                    }
                ],
                "$schema": "http://adaptivecards.io/schemas/adaptive-card.json"
            }
        },

User Admin PowerApp (Part 3)

The manager has been notified that they have a new report, they have updated some fields on the user record, and they now want to add some roles and teams to allow the new starter to begin work.

Objectives

  • The Scenario (Part 1)
  • Notifying the manager of a new Employee (Part 1)
  • PowerApp to display and update User Data (Part 2)
  • Update Roles and Teams (This Part)

Listing Roles that the user hasn’t got

The grid at the bottom of the user screen shows the roles that the user has. If we want to add a role to that list, let’s first display the roles available to them.

In the form’s OnVisible event, there is a lot going on. I have already discussed defining the XML to get the list of roles and teams for the user. In this code I also create XML to retrieve the roles the user hasn’t got. Again, FetchXML Builder is your friend.

A looping concatenation isn’t available (well, not this week) in PowerApps, so you have to get around this by populating a table with one or more strings, then concatenating the output of the table. Using the Set command, create a variable and populate it with a string like the below, using the ForAll command to repeat the string build for each of the roles the user has got.

    Set(
        rolesTable,
        ForAll(
            secGroups,
            "<condition attribute=""roleid"" operator=""neq"" value='" & 'role2.roleid' & "' />"
        )
    );

When you then use the Concat function, this creates one string with as many of these parts in it as the user has roles.

<condition attribute="roleid" operator="neq" value='0699ab9b-984c-4896-9c8a-38352fdc3c93' /&gt;
<condition attribute="roleid" operator="neq" value='d77e2e20-4eac-4fbf-b1a8-5bec6f853ebf' /&gt;

The final string combines this with the other bits I am interested in, selecting the attributes that are required as well as filtering the roles returned to only those in the user's BU. If this condition is not put in, all the roles associated with all the BUs will be returned. In D365, a role is duplicated for each BU you create, and you need to associate the user with the right role for their BU.

    Set(
        rolesNotGot,
        "<fetch top=""50"" >
            <entity name=""role"" >
                <attribute name=""name"" />
                <attribute name=""roleid"" />
                <filter type=""and"" >" & Concat(
            rolesTable,
            Value
        ) & "</filter>
            <filter>
                <condition attribute=""businessunitid"" operator=""eq"" value='" & MyReports.Selected.'Business Unit'.businessunitid & "'/>
            </filter>
        </entity>
    </fetch>"
    )

This produces XML that looks like the below.

<fetch top="50" &gt;
                        <entity name="role" &gt;
                            <attribute name="name" /&gt;
                            <attribute name="roleid" /&gt;
                            <filter type="and" &gt;<condition attribute="roleid" operator="neq" value='0699ab9b-984c-4896-9c8a-38352fdc3c93' /&gt;<condition attribute="roleid" operator="neq" value='d77e2e20-4eac-4fbf-b1a8-5bec6f853ebf' /&gt;</filter&gt;
                            <filter&gt;
                            <condition attribute="businessunitid" operator="eq" value='ec0e9d51-0e91-e911-a822-000d3a34e879'/&gt;</filter&gt;
                        </entity&gt;
                    </fetch&gt;

On selecting the plus button at the top of the grid for Roles, another collection is populated using the custom connector, passing the XML to the Roles definition.

ClearCollect(
    groupsandteams,
    D365FlowConnector.GetAllRoles(rolesNotGot).value
);

Add Role screen

The final screen is a simple gallery, with the name of the role displayed.

All the work is done on the + icon. This calls another custom connector, adding the role to the user. Postman is your friend here. My Postman request to insert a role for a user is shown below.

Basically, this posts a body containing a systemuserroles_association reference to the Web API, which creates a new one. The user reference is in the URL and the roleid is in the body.
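In raw form the request looks something like this ({{WebAPIUrl}} is a Postman variable for the Web API root, and the GUIDs are just the example values used elsewhere in this post):

POST {{WebAPIUrl}}/systemusers(EBD3707B-6C88-E911-A83E-000D3A323D10)/systemuserroles_association/$ref
Content-Type: application/json

{
    "@odata.id": "{{WebAPIUrl}}/roles(0699ab9b-984c-4896-9c8a-38352fdc3c93)"
}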

Now, I confess I am not the most au fait with the best way to use the Web API. This is one of the reasons I am writing a blog: learning. See that little $ref at the end of the URL? I spent hours trying to get this working before finding that it is required. If you omit the $ref, it returns a positive result, but nothing is actually created. The same happens if you try to add a role that belongs to a BU that the user doesn't. Someone tell me where this is logged so next time I can do some troubleshooting rather than trial & error?

The D365 documentation does mention it, but it is easily missed, and it is part of the OData standard for addressing references between entities. As I say, learn every day.

To convert this to a custom connector definition, the GUID of the user account in the URL needs to be replaced with a parameter (note the curly brackets), so it looks like the below.
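So the URL in the action definition ends up something like this, with the GUID swapped for a parameter (I've called it userId; the name is my choice):

POST /api/data/v9.0/systemusers({userId})/systemuserroles_association/$ref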

In PowerApps, after refreshing the connector, there is a new method on our connector. On click of the + in the Add role screen, this is called.

D365FlowConnector.AddRole(
    MyReports.Selected.User,
    "https://urlofyourcrm.crm.dynamics.com",
    "/api/data/v9.0/roles(" & roleid & ")"
)

Even though the second parameter has been hidden and given a default value in the connector, it still prompts to be entered. Not sure why.

The Teams addition is the same logic, but a different connecting entity. The Postman request for this is below.
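Assuming the teammembership_association navigation property (teammembership being the connecting table, as covered in Part 2), the request is the same shape with teams swapped in:

POST {{WebAPIUrl}}/systemusers({userId})/teammembership_association/$ref
Content-Type: application/json

{
    "@odata.id": "{{WebAPIUrl}}/teams({teamId})"
}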

Deleting a Role

To complete the circle, a manager should be able to delete a role or team that the user belongs to. This is again done via a custom connector; the Postman request is below.

As with the create definition, the 2 GUIDs in the URL need to be swapped out for parameters, so the definition is created like this.
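Disassociating is a DELETE against the same navigation property, so (with my parameter names again) the definition's URL looks something like:

DELETE /api/data/v9.0/systemusers({userId})/systemuserroles_association({roleId})/$ref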

Once refreshed in PowerApps, this is connected to the remove button on the Roles grid.

D365FlowConnector.DeleteRole(
    MyReports.Selected.User,
    'role2.roleid'
);

After this call, the grids are refreshed to show the action has taken effect.
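The refresh is nothing clever; it just re-runs the same ClearCollect that populated the Roles grid in the first place:

// Re-query the user's roles so the grid reflects the delete
ClearCollect(
    secGroups,
    D365FlowConnector.GetSecGroups(secString).value
);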

Teams works in the same way, using a URL like this.
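Under the same assumptions as the add, that would be:

DELETE /api/data/v9.0/systemusers({userId})/teammembership_association({teamId})/$ref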

And that is it! This app works well to bridge the gap for onboarding users in a busy enterprise, especially as a PowerApp: it is available to users on the go.

Custom connectors really bring all the standard Customer Engagement functionality within reach of the “citizen developer”. As enterprise-level organisations get on board with PowerApps, small apps that make the employee experience easier will become necessary. This is only an example of how this could be done. Reach out if you would like me to share the app with you.

User Admin PowerApp (Part 2)

So, after notifying the user that there is a new employee in their team, the manager needs to be able to update the data.

Objectives

  • The Scenario (Part 1)
  • Notifying the manager of a new Employee (Part 1)
  • PowerApp to display and update User Data (This Part)
  • Update Roles and Teams

My Reports

I am not going to go through how to create a PowerApp; there are numerous blogs and pages that step you through this. May I recommend the Microsoft page with it all on?

My PowerApp is pretty straightforward at the start. The first screen retrieves all the users that the currently logged-in user manages, connected to a User data source in the CDS.

Start with a new List form, connect it to your CDS User data set and select a few fields. With the List form, there are several parts that you need to configure to make the buttons at the top of the screen work correctly.

Firstly, the Items property needs to be configured to correctly display and filter the data according to the search text the user has entered and the chosen sort order.

SortByColumns(
    Search(
        Filter(
            Users,
            Manager.internalemailaddress = User().Email
        ),
        TextSearchBox2.Text,
        "fullname"
    ),
    "fullname",
    If(
        SortDescending,
        SortOrder.Descending,
        SortOrder.Ascending
    )
)

The core of this is the Filter, where Users (the data source) is filtered to only show those records where the email address of the user's manager is the current user's email address.

Filter(Users, Manager.internalemailaddress = User().Email)

Wrapped around this is functionality to sort and search this grid.

BTW, the picture is the ‘Entity Image’ field in the data set, the picture that a user can upload into D365 against their user account.

User Details

The next screen is a standard edit form, navigated to via the little arrow with this logic

Select(Parent);
Navigate(
    'Employee Edit',
    ScreenTransition.CoverRight
)

The form again connects to the User data set, with the Item property in the edit portion being the significant bit.
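If you have followed the same naming as me, the Item property is simply the record selected in the gallery on the first screen:

MyReports.Selected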

I have just selected some random fields to display here, ones that I think it would be appropriate for a manager to update. The Business Unit is a key field for a user; it is a selection, and needs a little help to display the right data and update the record appropriately.

Start by adding the field. If you select the datacardvalue control (the drop down itself in the form), notice that a couple of things are wrong.

Business Unit is a parent of User, a Lookup relationship, and PowerApps has decided (not sure why; the default for lookups surely should be the name field, which is how D365 works) to select the Address1_City field to display to the user when they are selecting. Change this to "name", and change the search field as well to match.
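Assuming the control is a combo box, the relevant properties end up looking something like this (property names here are a guide from the current canvas designer):

DisplayFields: ["name"]
SearchFields:  ["name"]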

Now, the form will display the BU of the user as well as allowing an update of this field.

Displaying Security Roles

At the bottom of the edit form are 2 grids to display the user's security roles and their teams. Both operate in the same way.

To get at the roles for a user, you have to use a bit of FetchXML. There is a hidden link table, or a collection-valued navigation property, systemuserroles_association, which links a role with a user.

If you are using any sort of FetchXML, FetchXML Builder (FXB) by Jonas Rapp is essential. It allows a developer to define the exact fields, connections and filters that form a query by looking at the data model. Start with a blank query and select the systemuser entity.

In our scenario, the query should return the roles for one user, the one that has been selected by the manager. Add a filter, then a condition, select the systemuserroles entity, and the systemuserid attribute (field). The operator is equals and the value is the GUID of the user account.

To test the query, paste in the GUID of the user. Another great tool is the Chrome add-on Level Up for Dynamics by Natraj Yegnaraman, which allows you to quickly get to the GUID of any record, amongst other useful tools (please don't tell an end user about God Mode). The query should retrieve all the fields on the User entity for the chosen user.

Next, select a link-entity, namely the many to many systemuserroles.systemuserid -> systemuserid.

Executing this query will result in a lot of fields being returned, most of no interest to our query, so add some attributes, which limits the fields returned. We are only interested in the roleid and the name.

Hit execute this time and this will display the roles that are associated with the user selected.

This is all well and good, but how does this data get into the PowerApp? A custom connector, of course. I have used custom connectors a lot in my other posts to get at the bits the standard CDS connector can't. I won't discuss how to create the connector; my post on LUIS covers that, and I expand on it when fixing the bug on creating attachments in D365.

Creating the Connector Definition

Again, Postman is your friend here. Each of the main entities will accept FetchXML as a parameter; this is what is created in Postman:

{{WebAPIUrl}}/systemusers?fetchXml=<fetch mapping="logical" count="50" version="1.0">
  <entity name="systemuser">
    <attribute name="fullname" />
    <filter type='and'>
      <condition attribute='systemuserid' operator='eq' value='EBD3707B-6C88-E911-A83E-000D3A323D10' />
    </filter>
    <link-entity name="systemuserroles" from="systemuserid" to="systemuserid">
      <link-entity name="role" from="roleid" to="roleid">
        <attribute name="name"/>
        <attribute name="roleid"/>
      </link-entity>
    </link-entity>
  </entity>
</fetch>

Paste this into a new connector action and PowerApps will convert the fetchXml query string into a new parameter for the action.

Test this action with a copy and paste of the XML that you generated in FetchXML Builder, and the action will return the matching records.

Using the Connector in the app

Add the custom connector as a Data source in the App.

To call the connector, on the OnVisible of the Employee Edit form, generate the XML first. Set a variable to the XML generated from FXB, substituting the currently selected user id as appropriate.

    Set(
        secString,
        "<fetch mapping=""logical"" count=""50"" version=""1.0"">
            <entity name=""systemuser"">
            <attribute name=""fullname"" />
            <filter type='and'>
                <condition attribute='systemuserid' operator='eq' value='" & MyReports.Selected.User & "' />
            </filter>
            <link-entity name=""systemuserroles"" from=""systemuserid"" to=""systemuserid"">
                <link-entity name=""role"" from=""roleid"" to=""roleid"">
                <attribute name=""name""/>
                <attribute name=""roleid""/>
                </link-entity>
            </link-entity>
            </entity>
        </fetch>"
    );

MyReports is the list on the first screen; Selected is the user that the manager has selected. PowerApps converts the User to the GUID of the user (handy). Be careful with the quotes here: I found that when pasting, curly versions of the straight quotes came in (“ rather than ") which took me a while to resolve. Also, ensure all the double quotes are present. FXB renders double quotes, and you need to double them up for PowerApps to infer an actual quote rather than the end of the string.

Update: Thanks to Jonas Rapp for reaching out on Twitter to put me right. In FXB there is a setting to use single quotes rather than double in the rendered XML.

This then allows a straight copy and paste into your PowerApp:

Set(
    secString,
    "<fetch top='50' >
  <entity name='systemuser' >
    <filter type='and' >
      <condition attribute='systemuserid' operator='eq' value='" & MyReports.Selected.User & "' />
    </filter>
    <link-entity name='systemuserroles' from='systemuserid' to='systemuserid' intersect='true' >
      <link-entity name='role' from='roleid' to='roleid' >
        <attribute name='name' />
        <attribute name='roleid' />
      </link-entity>
    </link-entity>
  </entity>
</fetch>"
);

After the string is created, pass this to the custom connector, inserting the return into a collection. This is the collection that is shown in the grid.

// Get the groups for the selected user
ClearCollect(
    secGroups,
    D365FlowConnector.GetSecGroups(secString).value
);

Put a gallery on the form and use this collection as its Items list, sorted by the role name just for usability.
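Assuming the link-entity alias role2 that is used elsewhere in this app, the gallery's Items property ends up as something like:

Sort(secGroups, 'role2.name')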

The Teams grid is populated in much the same way. The connector is based around the teams entity rather than the system user (in hindsight, I should have based the roles one on roles rather than user, but it achieves the same result, and it is all about learning, isn't it?). The XML looks like this.

Set(
    teamstring,
    "<fetch top=""50"" >
  <entity name=""team"" >
    <attribute name=""name"" />
    <filter>
      <condition attribute=""isdefault"" operator=""eq"" value=""0"" />
      <condition attribute=""systemmanaged"" operator=""eq"" value=""0"" />
    </filter>
    <link-entity name=""teammembership"" from=""teamid"" to=""teamid"" intersect=""true"" >
      <filter type=""and"" >
        <condition attribute=""systemuserid"" operator=""eq"" value=""" & MyReports.Selected.User & """ />
      </filter>
    </link-entity>
  </entity>
</fetch>"
);

teammembership is the connecting table.

So, that is enough for now: the app displays users that report to me, and I can see their teams and roles. The next post will show how the teams and roles are updated.

User Admin PowerApp (Part 1)

Sorry it has been a while since my last blog post; this scenario has taken a while to get to a state where I was happy to show it off. Mainly due to my own lack of understanding of the intricacies of the D365 API, but I have also been busy outside the blog, you know, real life.

Objectives

  • The Scenario (This part)
  • Notifying the manager of a new Employee (This Part)
  • PowerApp to display and update User Data
  • Update Roles and Teams

The Scenario

Big Energy Co is going from strength to strength, presumably because of the innovative solutions using LUIS, Alexa and IFTTT.

The HR department is ramping up recruitment and new teams are being shaped to support all the growth.

One of the criticisms from the managers is that it takes a while for the IT / D365 administrators to get users in the correct teams and security roles so they can be effective in D365.

A clever chap in the management team suggested that they be given an app that would allow a manager to update the roles and teams (and other relevant parts of a user) without resorting to logging into D365 administration. Something they can use wherever they have WiFi or a data connection.

It would also be good to get a notification when they have a new employee, or someone is added to their reports.

The Flow

This flow is quite simple: trigger an email when a user has the Manager (parentsystemuserid) field updated. O365 will create the user for us (assuming you are in the cloud) and an administrator will still have to update the user's manager.

Here, the attribute I am interested in is parentsystemuserid.

Next, just check to see if the manager is actually populated. In a lot of businesses, removing the manager is part of the off-boarding process, to tidy up selection lists etc.

Then, get the manager user record from D365 so that the email can be sent to it.
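For reference, the whole flow in outline (the step names are approximate, taken from the CDS and Outlook connectors of the time):

Trigger:   When a record is updated (Users)
           Attribute filter: parentsystemuserid
Condition: parentsystemuserid is not blank
  Yes -> Get record (Users, parentsystemuserid)        -- the manager
      -> Send an email to the manager (internalemailaddress)
  No  -> do nothing (the manager was removed, e.g. off-boarding)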

Told you it was simple. I am sure that this could have more logic: do we need an approval step before assigning this user? Do we have to wait for HR to do some work and only activate the user once all the checks are done?

Next, I’ll step through the PowerApps set up to retrieve data from my reports.