MS Certifications – Sharing my Mind Maps

Over the last 15 months, I have completed 5 Microsoft certifications and have used various sources on the internet to get me through.

My method of remembering facts relies on completing a mind map. I note down the key points (key to me anyway) for a subject and link them together. The map serves as a quick refresher on the day, but building it also forces me to fill out my knowledge as I research the content and develop my understanding.

Please don’t think that memorising these maps will be all you need to do to pass any of the exams. I share them to assist you in getting that certificate and maybe trigger something that you don’t understand so you can research that area more effectively. I also don’t guarantee that they are all accurate. The vagaries of time and my ineptitude will ensure they are not.

It goes without saying that practice is the best way of ensuring you have the right understanding of the product, but I have found that there are always areas you have not come across, even if you have been working on an application for years. Microsoft never stops developing.

I will keep this page up to date as I complete more exams with the mind map and any resources I used, with the exam I took most recently at the top.

All the mind maps can be found here. I use Freeplane as my mind mapping tool of choice, because it is open source, free for unlimited maps, allows tweaking of styles etc. and is not trying to sell you its bigger brother. There are others out there which offer increased functionality, but this works for me. You can import Mindmap files into most other services.

MB-210 – Sales

Mind Map

Sales is my bread and butter, with lots of experience in the life-cycle etc., but the newer additions, such as Sales Insights, were new to me.

Again, I used Neil’s blog. Great insight, though I would also recommend getting an understanding of the minimum requirements for some services, which I wasn’t ready for.

The scenarios are full on too, though the advice is to read each one through before tackling a question and re-read the relevant sections before giving your answer.

MB-240 – Field Service

Mind Map

Field service was a stretch goal for me, as I have never used it in anger on a client site. It is a relatively new product in the suite, but one that is getting a lot of traction in the D365 community.

I went into this exam thinking I was not prepared, but managed a decent pass with my learning. The exam itself is quite short, but did stretch my understanding.

My first resource, as always, was Neil Parkhurst. He has a great series of posts on this exam’s predecessor, MB2-718. Whilst slightly dated, it is still relevant.

Further, I went through the Microsoft Learn topics on Field Service here

MB-200 – Customer Engagement Core

As I have been working with D365 a while, I was very lazy when going into this one. I just assumed I knew it all, brushed up on the new stuff and went for it. Thankfully, this didn’t bite me in the ass.

I didn’t do a map for this one, but I reviewed the map for MB2-716 here.

I also reviewed Neil’s posts on this one and @NZ365Guy’s videos on the subject in the OpenEdx course, which are great, particularly for the new stuff in the application, like Flows.

AZ-203 – Developing Solutions for Azure

Mind Map

I consider myself to be a lapsed developer. I love coding, but my career has taken me into more of an advisor and designer role than a “coder”. This exam was proof to myself that I could still run with the cool kids, and it also exposed me to a lot of the Azure stack that I did not know.

As I was preparing for this exam back in January, the Azure Developer Associate certification swapped from 2 exams to 1, AZ-200 being the first of the original pair, which is why references to AZ-200 linger.

For this exam I was indebted to the Pluralsight course, which I was thankful to have access to.

From a D365 developer point of view, this was a tough one. It was a step above and beyond what I expected. There is so much content, each area required a decent level of understanding, and it really taxed me. This was the first exam to push the time I had available to complete it, with a large number of questions and a lot of deliberation.

MB2-718 – D365 for Customer Service

Mind Map

This was the last of the older exams I took before Microsoft revamped the exams. Neil Parkhurst has an excellent blog on this one, which was my source. I have been doing service for quite a while, but the intricacies around SLAs and Queues were something that I had to learn. It also contained Unified Service Desk and Voice of the Customer, both subjects I had not come across.

MB2-716 – Customisation & Configuration

Mind Map

Another old exam, and something I take pride in as it is my bread and butter. This was another re-cap of my understanding, particularly around the bits where I would just Google when I came across it in my day job. Neil Parkhurst again provided the detail which saw me through. Bits on auditing, configuring email etc. were items where I knew the fundamentals, but Microsoft has a knack of slightly tweaking the wording to give a different answer, so it is vital you know your stuff.

MB2-715 – D365 Online Deployment

Mind Map

I am fortunate to have come to D365 after online was the chosen deployment method. I don’t have the nightmares some older community members have around installing and configuring on-premise solutions. This is another exam that is part of my day-to-day role, so it was just a matter of brushing up on where I did not have enough knowledge.

Neil Parkhurst (who doesn’t owe Neil a beer?) has it covered again. There is a lot you take for granted here, that you need to get to grips with, such as licensing, what you need to do in the Office Admin portal vs D365 admin, Email configuration and integration with other apps like SharePoint.

Cloning Flows: Location triggers for everyone

Sometimes ideas don’t work out. This is one of those times. But the reason I blog is to learn, expanding my knowledge of the Power Platform and of components outside of it. So, I figured I would blog about my failure; learning is learning. Then, as I started testing the flow again, moving environments etc., it started working. I guess this is down to the location trigger being a work in progress. Moral of the story: if it was broken last month, try again this month.

Back in July, I started working on this scenario, but couldn’t get it working. I noticed @Flow_Joe_ & @JonJLevesque did a video walkthrough of using the Geofence trigger to send out a summary of the customer when a salesperson enters an area, which reminded me of my failure, hence why I have written it up. While Joe & Jon’s video shows how easy it is to create a flow, for salespeople in general I think this is a step too far. You cannot expect a salesperson to have any interest in creating flows to do this, but you can expect them to click a button on a form within their D365 application.

Objectives

  • The Scenario
  • Creating the Flow button
  • Cloning the Flow
  • Outcome

The Scenario

Numerous times when I have been a customer, a salesperson would come to us not knowing that we have several major cases logged with them against their product. This is mainly down to lazy sales people (I know, they don’t exist), but it would be awesome for the salesperson to get a summary of the account when they get in the door of a customer. The number of support cases, a list of the open opportunities and orders, any complaints that have been logged. All of this information is available to the salesperson via the D365 mobile app, but it would be good to ensure that they get this information and are less likely to get caught out by a customer venting at them for 5 critical bugs that have been sat around for a month.

The Solution

Flow has a new trigger, still in preview, Location, which is triggered via the Flow mobile application when a user enters or exits an area. This is perfect for our scenario: stick a geofence around a customer’s location and, when the user enters the area, the flow gets triggered. Look up the customer, format an email and send it to the user.

Flow is user friendly, a low-code solution, but you cannot expect a salesperson to create a flow for each account they want this trigger for. What can be done is to put a button on a form that automatically creates a Flow for the user against the account they have selected, which is then triggered when the user enters the location.

There are 2 separate series of flows required: the first starts with an action from the user on the account record, which triggers the cloning of a template.

The second series is the clone of the template, which sends the salesperson the relevant information when they enter the customer’s property.

Creating a Flow Button

Starting with a CDS “When a record is selected” trigger, configure it to be used when an account is selected.

The next step is to retrieve who is running this flow. As mentioned, this button will be published on an Account form, so it is essential to know who is running it so an email can be sent to them. The account information and who the user is are sent as the body to an HTTP POST trigger, which starts the next flow in the chain.
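As a rough illustration, the body posted to the next flow looks something like this. The “Account” key is the one the clone flow reads later via triggerBody(); the “Email” key name and the placeholder values are my assumptions, so adjust them to whatever your button flow actually sends.

{
  "Account": "<guid of the selected account>",
  "Email": "<email address of the user who pressed the button>"
}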

An HTTP trigger is used because the next Flow requires enhanced access: an admin user needs to clone a Flow, which you would not want a normal user to be able to do. The admin account is also used to ensure any runs that happen are legitimate; the admin or system account shouldn’t belong to someone who could have the Flow app in their pocket.

To have the URL to send to, the next Flow needs to be created first, but here is where the button appears within the D365 interface. The first time you run it, there are a few confirmations that you need to work through, and then you can finally run the flow.

Cloning the Flow

This flow clones an existing template, tweaks it slightly and gets it up and running as the user.

Starting with an HTTP Trigger, I use a sample payload to build the schema.

Next is retrieving the account. As the account id is passed in from the calling Flow, a simple Get Record is used.

Next, configure the name of the Flow that will be created, making it unique for the user by adding their email address in. A flow definition string variable is also initialised for later use.
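As a sketch, the name could be built with a simple concat(), assuming the account was retrieved by an action called Get Account and the user’s email arrived in the trigger body under an “Email” key (both assumptions on my part):

concat('Geofence - ', body('Get_Account')?['name'], ' - ', triggerBody()?['Email'])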

In this Flow, the user that called it from the button is needed, so it retrieves the profile using the Office 365 Users action.

Next, retrieve my template flow. Flow has several actions around the management of Flows, which are incredibly useful to a Flow administrator. The template flow is a simple flow with a location trigger and an HTTP action that calls a secondary flow. I will discuss the detail of this later.

The next couple of actions determine whether a flow with the FlowName defined above already exists: firstly by getting a list of all my flows (as an admin), then getting a list of Flows in the organisation, then filtering that list by the FlowName defined in the initial steps, as sketched below.
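A minimal sketch of that filter, assuming a Filter array action over the list output, that each returned flow exposes its display name under properties/displayName, and that the variable is literally called FlowName (all assumptions about the exact shapes and names):

@equals(item()?['properties']?['displayName'], variables('FlowName'))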

If there is a flow already, just stop. If not, carry on & clone the template flow.

The Template

The Log Template is a very simple flow: a small location trigger with an HTTP call action. The HTTP call passes in the user’s current location, the account id and the user who started the process. Both the email and the account will be swapped out as part of the clone.

The trigger region is essential for any location trigger. This one is centred on the Microsoft campus in Redmond. Someday I will be fortunate enough to go to the motherland. I chose this as it is not likely that the user would have them as a client, but it doesn’t really matter where you choose, as what you need is the latitude and longitude so you can replace them when you clone the flow.

If you click on the peek code button against the trigger, it shows a JSON representation of the trigger. The latitude and longitude are those of the Microsoft office, and this is the bit I need to replace.

Cloning the Flow (part 2)

All a Flow is, underneath, is a JSON file. Obviously, how it is rendered and how the hooks and actions work is where the power lies, but the definition itself is a JSON file. Using this knowledge, we can create a new version of the template with a location specific to the account.

The template in all its glory is below. Just using simple find / replace, we tweak it to the specific location, account and user.

{
  "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "$authentication": {
      "defaultValue": {},
      "type": "SecureObject"
    }
  },
  "triggers": {
    "manual": {
      "type": "Request",
      "kind": "Geofence",
      "inputs": {
        "parameters": {
          "serializedGeofence": {
            "type": "Circle",
            "latitude": 47.64343469631714,
            "longitude": -122.14205669389771,
            "radius": 35
          }
        }
      }
    }
  },
  "actions": {
    "HTTP": {
      "runAfter": {
        "Initialize_Email_Variable": [
          "Succeeded"
        ]
      },
      "type": "Http",
      "inputs": {
        "method": "POST",
        "uri": "https://prod-68.westeurope.logic.azure.com:443/workflows/<GUID>/triggers/manual/paths/invoke?api-version=2016-06-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=<SIG>-JQQvYT0",
        "body": {
          "lat": "@{triggerBody()?['currentLatitude']}",
          "long": "@{triggerBody()?['currentLongitude']}",
          "user": "@{variables('Email')}",
          "account": "@{variables('accountId')}"
        }
      }
    },
    "Initialize_Account_Variable": {
      "runAfter": {},
      "type": "InitializeVariable",
      "inputs": {
        "variables": [
          {
            "name": "accountId",
            "type": "String",
            "value": "<accountId>"
          }
        ]
      }
    },
    "Initialize_Email_Variable": {
      "runAfter": {
        "Initialize_Account_Variable": [
          "Succeeded"
        ]
      },
      "type": "InitializeVariable",
      "inputs": {
        "variables": [
          {
            "name": "Email",
            "type": "String",
            "value": "<email>"
          }
        ]
      }
    }
  },
  "outputs": {}
}

Back on the clone flow, the next step is to convert the template to a string. This makes it easier to replace the latitude, longitude etc. with the ones we want.
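A hedged sketch of that conversion, assuming the action that retrieved the template is called Get Flow and that it surfaces the definition under properties/definition (both assumptions about the management connector’s output shape):

string(body('Get_Flow')?['properties']?['definition'])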

The OOTB account record has latitude and longitude fields. This data is not normally populated, but it is used by Field Service and other applications. I used Field Service to populate it via the Geo Code button.

As you can see from the above, Field Service populates both latitude and longitude to 5 decimal places. This is common precision in any mapping software such as Google, so I am not sure why, if you do the same via the Flow trigger, you get precision to 15 dp for latitude and 17 dp for longitude.

The next 2 steps are a result of me trying to get the flow to work. One of my thoughts was that the flow was expecting all 15 decimal places to be populated, so these steps pad out the number held against the account with additional digits.

The expression is the same for both

concat(string(body('Get_Account')?['address1_latitude']),'111111')

The next step replaces the newly calculated values for latitude and longitude in the JSON definition

replace(replace( variables('flowdefstring'),'47.643434696317136',outputs('Replace_Lat')),'-122.14205669389771',outputs('Replace_Long'))

The accountId is also replaced. This is used in the cloned flow to define which account the user selected. The location trigger only gives you the user’s current location, not the centre of the circle you configured. You could use these values to find the account, with difficulty, unless there is something I am missing; I prefer to add a variable in the clone which holds the account id.

replace(outputs('Replace_Lat_Long'),'<accountId>',triggerBody()?['Account'])

The same goes for the email to send to. It should be the user who triggers the geofence, but it seems to end up being the admin. As I clone the Flow with an admin account and then add the user as an owner, it runs under the admin account.
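For completeness, the email swap follows the same pattern as the account one, assuming the previous replace step is called Replace Account and the trigger body carries the email under an “Email” key (both hypothetical names of mine):

replace(outputs('Replace_Account'),'<email>',triggerBody()?['Email'])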

There is enough info now to create this flow. Using the Create Flow action, the new flow is created and up and running.

I use a json() expression to convert the string I have used for the find / replace of the latitude, longitude etc., to ensure the Flow is created with a JSON definition.

json(variables('flowdefstring'))

The final step is to add a Flow owner. As the sales person who triggered the flow is who it should trigger on, make them the owner, so it should run under their context.

Outcome V1

Ignore this bit if you want to avoid the author moaning about stuff that doesn’t work.

If I run the whole flow, I do generate a new Flow.

Going into what was generated, using peek code again, you can see that the Microsoft location has been replaced with the Avanade office

The trigger is active, but this is where it stops. I cannot get this trigger to fire. Changing the location to my home, going for a walk and coming back doesn’t trigger it.

If I don’t put in the padding for the latitude and longitude, it doesn’t trigger.

If I clone from my location, not changing the latitude and longitude, still the trigger doesn’t fire.

If I configure a new trigger from scratch, that works.

Everything about the trigger looks the same when you get it in code view, but there must be something different.

This is where I started reaching out. I tweeted about it to the gods of Flow and asked in the Flow forum, where I did get a response, basically saying the same thing and pointing out that the location trigger is in preview.

So, if you have got this far, how do I fix it?

Outcome V2

Like I said at the outset, this didn’t work for me. Frustration set in, and I forgot the idea. But, as I was putting together this blog post, re-deploying the components as my demo system had expired, it worked!

So, moving on, we need to send an email to the user with the playbook for the account. I want to list the last 5 critical cases, last 5 open opportunities, last 5 notes and any description the user has put in.

This flow is triggered by an HTTP request, with the schema defined from a sample payload; the body contains the user’s current location, who triggered the workflow and which account.
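Based on the body the template flow posts (lat, long, user, account), a sample payload to generate the schema from would look roughly like this; the values are placeholders of mine:

{
  "lat": "51.50000",
  "long": "-0.12000",
  "user": "<email of the salesperson>",
  "account": "<guid of the account>"
}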

Then, a great time for parallel branches: the Flow retrieves the cases, notes and opportunities in parallel.

Each branch does a similar thing. Looking at the Notes branch: firstly, retrieve the records with a CDS List Records action, using an OData filter and order-by and returning the top 5 only, roughly as sketched below.
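A rough example of the Notes settings, assuming the standard annotation fields and that the account id comes in on the trigger body under the “account” key; the exact filter field will depend on how your notes are related:

Filter Query: _objectid_value eq @{triggerBody()?['account']}
Order By: createdon desc
Top Count: 5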

Next, put this into an HTML table, selecting the output from the Get Notes action. I select the Advanced options, then Custom columns; this way I can define the order and which columns I want to display.

The final step is to send an email

Obviously, this can be customised to your business need, but my example lists the cases, opportunities & notes, and reminds them to fill in a contact report.

Summary

So, the user selects a button on an account form, which allows them to receive updates about one of their customers when they enter the location of the account. Easy.

I tested this with my home address and with a different user and you can see that I get the email through. Veronica is in the US, I wasn’t up at 1am writing blogs & fixing Flows.

You can also see that Flow notifies the user that it has made them an administrator on a Flow.

This Flow starts with a Flow button on a record, making it a user-initiated process. It could be triggered off a record creation process – If the user follows an Account, create this automation for them, as long as they have opted in.

There is location tracking in the Field Service application, but that requires the Field Service mobile app and is not suited to a salesperson. With this approach, they just need to install the Flow app on their device and forget it is there.

AI Builder – Text AI

My blogging journey started with using LUIS, one of Microsoft’s Cognitive Services, to automate case assignment. That blog goes into detail about how it all hung together: using a model defined in LUIS, calling the LUIS endpoint when a new case is created and classifying the case, by subject, with the result of the call.

After my summer break (sorry, but family etc comes first) I thought I would revisit this scenario, but using one of Microsoft’s shiny, new AI Builder capabilities, Text Classification AI Model.

Objectives

  • The Scenario
  • Training your Model
  • Getting Data
  • Publishing the Model
  • Using the Tags

The Scenario

In my first blog, I went through the scenario, so I don’t want to repeat myself, but for the lazy who don’t want to click through…

Big Energy is a supplier of energy products to end users. They have a call centre which handles any query from the customer. As a perceived leader in the sector, it is always willing to use the latest technology to allow users to interact with them, which reduces the pressure on the customer support centre.

Big Energy has a mailbox configured to accept customer emails about anything and, rather than have a group of 1st line support employees filtering out and categorising the emails based on the content, wants to use cognitive services to improve the process of getting the email (the generated case) to the right team.

Using AI to file the case

LUIS does a great job of this, with a BA providing sample utterances for the model and training it.

Text Classification AI Model does it slightly differently. The model expects users to provide data (in the CDS) in the form of text blocks and representative tags for the data. Both need to be in the same entity in CDS.

On a standard Case record, the classification or tag is the Subject field. Subject is a parent record of Case, and the tag would be the name of the subject. As Subject and Case are separate entities, the Text Classification AI model will not work against them directly. A field, even a calculated one, has to be introduced on Case to enable the classification AI to work. Adding data to an entity from a parent entity breaks my Third Normal Form training (anyone remember that? Is it still a thing?).

I have raised this issue as a new idea on the PowerApps ideas forum, go there and give it a vote!

The new logic for our AI model is that the AI will classify the incoming case, adding a tag. This will trigger a flow, changing the subject of the linked case accordingly. This will trigger re-routing of the case like it did in the original LUIS method.

Training your AI

With any AI model, it needs training. The AI needs to separate the wheat from the chaff. Creating a model is simple in PowerApps.

Start at make.powerapps.com and select AI Builder, then Build

There are 4 options here

Binary Classification is useful to give a yes / no decision on whether data meets certain criteria. The criteria can be up to 55 fields on the same entity. For example, is a lead with a low current credit limit, high current account value, no kids but has a pink toe nail (shout out to Mark Christie) likely to get approved for a new loan?

Form processing is intended to assist users in automating the capture of scanned documents to prevent re-keying. An example would be any forms handwritten as part of a sales or service process (before you convert it to a PowerApp, obviously).

Object detection assists in classification of items, be it types of drink, crisps or bikes, etc.

Text classification decides on a tag for a block of text, for example, a user could enter a review of a product online and text classification could understand what product it was for or whether it is a positive review.

All 4 of these have origins in the Cognitive services provided by Azure, LUIS being the big brother of Text Classification.

Ensure you are in the correct environment. Text Classification only works on data within your CDS environment, so don’t expect to reach out to your on-premise SQL server. There are ways to bring data into CDS, not in scope for this discussion.

Selecting Text Classification displays a form to give you more understanding, and it is here that you name your model

Hit Create and then Select Text. This will list all your entities in your CDS environment (in my case, a D365 demo environment).

Select the entity you want, Case for our PoC.

The interface will then list all the fields suitable for the AI model, namely anything that is a text field. I chose the description field, which is typically the email that the user enters when emailing in a case to the support department.

Hit the Select Field button and it will present you with a preview of the data in that table.

The next screen is to select your tags. This needs to be in the same table which, as already discussed, is a bit of a limitation of the AI Builder. Less normalised data is more common in Canvas apps or SharePoint-linked apps, but for structured data environments with relationships and normalised data this is a limitation that will hopefully be removed as Text Classification matures.

Also, option sets are not available, again another common categorisation technique. Multi-select option sets are an ideal tagging method too. Assume that this will come in time.

For my PoC, I created a new field, put it on the Case form and started filling it in for a few records.

Select the separator. If your tag field contains multiple tags, separated by a comma or semi-colon, this is where you configure it.

It also gives you a preview of the tags the AI Builder would find using your chosen method. You can see that with the No separator option, “printer; 3d” is one tag, rather than the 2 tags displayed if semi-colon is selected. This depends on your data.

The next page displays a review for your data and the tags that the AI builder finds.

Next, select a language for the text field dependent on your data.

Once selected, train your model. This is where I started to run into problems. My initial population of tags was not enough. The training came back quickly with an error. There should be a minimum of 10 texts per tag, which I didn’t have. That would be hundreds of rows. How was I going to automate creating data to give the Text AI enough data to be a suitable demo?

Getting Data

I needed thousands of records to train my model properly, thousands of records relevant to the tags I created. No online data creator seemed suitable, as none were specific enough, so how? A flow.

First, I created a column in the Contact table to store a number for each contact, a unique number so I can randomise the selection of a contact.

Next, I needed some data for the case descriptions and the tags. This was already done, as it is the same as the utterances and intents I used for LUIS, so I exported the LUIS configuration, put the data in an Excel file & added a number to each row.

Ready for the Flow

My simple flow is described below.

Ask for the number of cases to create, then keep creating cases, using a random contact and a random description, until you have reached that limit.

This flow is triggered manually, so I start with a manual trigger and also prompt for the number of cases to create.

The Subject variable is used later to define the reference for the subject we have chosen.

The default iteration limit for loops is 60. I realised late in the day that you can change that, but breaking up loops is good practice, to limit the scope of what could go wrong, so I created a loop-within-a-loop structure for my flow.

I restrict the inner loop to 50 iterations maximum, which means the number of times the outer loop runs has to be calculated. If I want 920 cases created, my outer loop would run 18 times, each pass creating 50 cases, and a final set of 20 would cover the rest.
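As a sketch, the number of full outer loops and the remainder can be worked out with div() and mod(), assuming the prompted value arrives as triggerBody()['number_of_cases'] (the input name is my own):

div(triggerBody()['number_of_cases'], 50)
mod(triggerBody()['number_of_cases'], 50)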

The next steps will initialise some counters used in the loops. I also want to ensure that if the user wants to create less than 50 records, the outer loop doesn’t run at all.

The outer loop will run for the number of iterations I have calculated; this is the loop condition, and the counter increments as the last thing in the outer loop. The first thing in the outer loop is to reset the case counter, which counts from 0 to 50 for the inner loop. If we are in this inner loop, at least 50 cases will be created.

The first thing it does is get a random contact, using an OData filter on the number field created earlier and a random number from 0-875 (875 being the highest number in that table), as sketched below.
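A rough example of that filter query, assuming the custom number column is called new_contactnumber (a hypothetical name of mine); note that rand() excludes its upper bound, so 876 is needed to include 875:

new_contactnumber eq @{rand(0, 876)}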

Once the contact is retrieved, find a random description / tag combination. The data from the LUIS utterances is held in an Excel file on a Teams site. Again, a rand() function takes a random number up to the maximum in that file.

Because more than one subject row could be returned, and because I don’t like nesting apply-to-each loops inside each other, I set the Subject Id variable here.

Ready to create a case now.

Nothing special. It also populates the tag field.

After some testing, to ensure that the case has the necessary fields entered, the flow was run for a thousand records without an issue.

Creating data this way isn’t quick, 20 mins for 1000 records, but it is easy and allows you to bring in reference data with little effort. Superb for PoC environments.

Training your Model (with data)

Once this data was generated, it was time to re-train my model. It ran through successfully this time.

The model is 97% sure that, whatever I throw at it, it should be able to match it against the tags. There is a quick test option here too, which allows you to enter a sample phrase to check your model.

All ready to publish.

Publishing your Model

Publishing the model allows it to be used within Flow and PowerApps.

Clicking Publish generates a child table of the entity you first chose, where the predictions are stored. The documentation states the table will be named TC_{model_name}, but mine was created with gobbledegook.

The link on the form helpfully allows you to go straight to the entity in the new customisation interface, where you can change the label of the entity.

Also, it is useful to change some of the views, particularly the associated results view. By default it includes name & date, which is pretty useless, so add the tag and the probability.

As this is a child table of Case, it is by default visible in the case form Related navigation item. In the classic customisation interface, you can change the label of this view.

Once it is published, users can use Flow and the Predict action to predict the tag for a given section of text, which is useful if you want to do stuff before the data reaches an environment.

Now that it is published, you need to allow the model to run. This means it runs every time there is a change to the text field. This is all done via Flow, so will use your flow runs. It stores the result in the new entity.

If a case is created now, it automatically creates the tag secondary record.

Using the tags

As AI Builder generates a record for you with its prediction, and the data is in CDS, it is simple to utilise that in a Flow. When a record is created in the AI Tags table, update the corresponding case to change the subject accordingly.

A simple trigger fires when a record is created. The first action is to find the subject from the tag, roughly as sketched below.
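A hedged sketch of that lookup: a CDS List Records on the Subject entity, filtering its title field on the predicted tag text from the new AI tag record (the exact field name holding the tag is an assumption on my part):

Filter Query: title eq '<predicted tag text from the trigger record>'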

Update the case record with the subject and the tag so the AI can be retrained later.

That’s it. Replacing LUIS with a more user-friendly environment is definitely a tick in the box for Microsoft. AI Builder in PowerApps feels like a simple, approachable stepping stone into the AI world for a lot of businesses. Hopefully, businesses will embrace these simple models to shortcut processes, improving employee and customer experiences.