IFTTT – Stopping Freezer melt

I have a problem, well my family has a problem. We can’t shut the freezer door. Regularly (once a month) our freezer would be left open just enough to defrost but not enough to sound the audible alarm, resulting in squishy food.


To combat this, I bought myself a Sonoff TH10 temperature sensor and configured it to send me an alert when the freezer reached a certain temperature. The problem is I became blasé about the alert, as it would go off as soon as someone opened the door, or I was not around to react. The module would only alert me once, when a warm temperature was reached, not keep reminding me or escalate to someone else when I didn't react by shouting at a teenager to close the door. So how could I ensure it alerted me less frequently and also alerted the rest of the family when there was a real issue?

I know that this article is a little bit of fun with no real business benefit, but using Flow to fix a problem in my life is worth talking about. Everyone will have a little annoyance that Flow can help with, and it acts as a training exercise for us all.


As with all problems, starting with a flow chart helps. This is what I need to achieve. The Sonoff device is configured to turn on and off when a certain temperature is hit. IFTTT is triggered by Sonoff on both of these conditions.

When the hot temperature is hit, wait 30 minutes (to allow for the temperature change when a door is opened and closed naturally), then check if the freezer is still over temperature. If it is still hot, send out a notification.

Then wait 60 minutes and check again whether the temperature is too high, before repeating the loop: sending out a further notification and waiting 60 more minutes.

If we check the temperature and it is now back down to cold, stop the process.
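The loop above can be sketched in code. This is a Python sketch of the logic only (the real implementation is a Flow; `freezer_is_hot`, `notify` and `escalate` stand in for the Sonoff check, the email to me and the email to the family):

```python
import time

def monitor(freezer_is_hot, notify, escalate,
            first_delay_mins=30, repeat_delay_mins=60, sleep=time.sleep):
    """Mirror of the flow chart: wait, re-check, alert, escalate, repeat."""
    sleep(first_delay_mins * 60)          # allow for normal door openings
    first_alert = True
    while freezer_is_hot():
        if first_alert:
            notify()                      # first alert goes to me only
            first_alert = False
        else:
            escalate()                    # later alerts go to the whole family
        sleep(repeat_delay_mins * 60)     # wait an hour before re-checking
    # temperature is back below the trigger point: stop quietly
```

The stop condition is the interesting part; as described later, checking the temperature directly is not actually possible with this hardware.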

IFTTT configuration

If you don't know IFTTT (If This Then That), it is a great service for interacting with disparate systems: a point-to-point integration to do something on a trigger. I have used it previously to log sales users' calls from an Android phone, and I use it to change my phone's background to the NASA wallpaper of the day. It is a free service which plugs the gaps where Flow does not have direct connections, such as with the Sonoff device.

Sonoff can trigger IFTTT, which in turn can trigger a web service, triggering the Flow.

Log into IFTTT and link your account to eWeLink (this is the app you install to manage the Sonoff switches).

Select Create from your account

This presents you with the standard IFTTT applet creator.

Hit the big + to display all the available services; eWeLink should be on that list.

Next, you get a long list of the options eWeLink makes available. This is where you need to select the appropriate switch; for me, just a single-channel switch.

The next step is to select your switch (named by you when you configured it in the eWeLink app) and whether you want this to be triggered when it is switched On or Off. For me, On means that the freezer has gone below -20 °C; Off means it has gone above -17 °C.

IFTTT then needs to know what to do.

Hitting the big + again lists the action services; the one we want is a Webhook.

Next, the action requires a URL, which comes from our Flow, so skip to that section to get the URL. We aren't posting anything to this hook, just a trigger. You could secure this a little more by passing in a key, so the Flow knows it is you triggering it.
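As a sketch of that key idea, the caller could post a shared secret in the body and the Flow could terminate any run where it doesn't match. Everything here is illustrative: the URL is a placeholder, and the `key`/`delay`/`itemId` parameter names are my own invention, not anything IFTTT or Flow mandates:

```python
import json
from urllib import request

FLOW_URL = "https://prod-00.westeurope.logic.azure.com/..."  # placeholder: paste the URL from the Flow's HTTP trigger
SECRET = "my-shared-key"  # hypothetical shared secret, checked by a condition in the Flow

def build_payload(delay_minutes=0, item_id=None):
    # The Flow would reject any call whose "key" doesn't match SECRET.
    return json.dumps({"key": SECRET, "delay": delay_minutes,
                       "itemId": item_id}).encode()

def trigger_flow(delay_minutes=0, item_id=None):
    req = request.Request(FLOW_URL, data=build_payload(delay_minutes, item_id),
                          headers={"Content-Type": "application/json"})
    return request.urlopen(req)
```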

Flow Configuration

Our flow starts with a web trigger.

I am passing a delay to the trigger, though the initial call will not have any delay.

Next, initialise a variable for checking the freezer later, as well as an ID for the SharePoint item I will create later.

If this has come from the initial IFTTT trigger, then we assume the delay is 30 minutes. You could just update a delay variable but, as you will see, I want to do something different when I am passed a delay the second time the flow is called.

In the yes branch, we now wait for the default 30 minutes.

Cancelling the Flow

Now, according to our design, we need to check the temperature to see if it has dropped back below our trigger point.

Ideally, you would call the Sonoff and ask for a reading, and I am sure more expensive IoT devices allow you to do this, but I am cheap. The Sonoff and eWeLink interfaces allow you to look at the temperature on your phone, but they do not broadcast the temperature to anything apart from the app, just the on/off trigger. Checking the temperature is not an option.

What about creating another trigger to turn off this looping? Unfortunately, this isn't possible; there is an idea on the Flow community that you can vote for here. I would also like to be proven wrong; reach out if there is a way to cancel a flow run.

I started doing this by adding a file in a Google Drive location (it could be SharePoint or OneDrive, but I wanted to expand my use of connectors) when the temperature went down, but this was inconsistent. I ended up getting multiple emails, as the file creation did not happen every time. I am sure I could sort this out, but that wasn't really the point of the flow.

I ended up creating an item in a SharePoint list when the flow was first triggered, completing the delay and then checking this item to see if another process had updated it while I was in pause mode. If that item has been updated, just stop the Flow. If it hasn't, send the email alert and trigger the flow again to repeat the process.

First, create the item in SharePoint with the current time (this is just for my interest, knowing how many times the freezer temperature triggers it).

The Id of the item that is created is copied so we can pass it to the next iteration of the flow if needed.

After the delay, check the item that was created for a cold temperature, using a bit of OData filtering to return only the item that was just created, and only if there is no cold time.
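The filter amounts to "the item this run created, with no cold time yet". A sketch of how that string could be built (the `ID` and `ColdTime` column names are from my list; yours will differ):

```python
def build_filter(item_id):
    # Only the item this run created, and only while the cold trigger
    # has not yet written a time back to it.
    return f"ID eq {item_id} and ColdTime eq null"
```

If this returns nothing, the freezer warmed up and nothing has cancelled the run yet; if it returns the item, the cold trigger fired while we were paused.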

This Scope usage is only to allow me to copy the contents to the other branch, saving me time. The copy function is really useful; if you haven't seen it, why not?

Next, if any items are returned (the item that was created was updated with a cold temperature), the title is changed to Cancelled. This is not really needed, but it gave me something to do in the Apply to each. The boolean is set to true, meaning the freezer is now cold and this flow can cancel itself.

Out of the loop, check the value. If the previous loop did anything, it will be true; if not, false. If the freezer is now cold, just terminate gracefully. If the temperature has not been updated, let it fall through to the next action.

The final part of the original "was a delay sent" check is to email. In this no-delay path, an email is just sent to me. I am using Gmail here, just because it was a new connector.

Finally, call the same flow but pass in the delay we want and the Id of the SharePoint item that was created.

What about if it gets really serious?

Telling me the freezer has been left open is fine; it usually results in a yell at a teenager to shut the f**king door. But what if I am not in?

After the first half-hour delay, if the freezer is still showing hot, I need to escalate, which for me means emailing my whole family. In the No branch, I delay for whatever value was sent in, set a variable to pass on to the next iteration and run the same steps as in the other branch; the difference is that I include more people on the next email.

Here, I email my whole family, with a nice in-your-face subject, and also hike up the importance. Hopefully, they won't ignore this email like they ignore me.

Triggering when it is cold

Using IFTTT the same way as earlier, when the temperature hits -20 °C, the second trigger is fired. This updates any items it finds in the list without a cold temperature, setting the trigger time.

And that’s it.

In my inbox, after I have taken the temperature controller out of the freezer for a little while, I get a series of "Freezer is hot" messages, also sent to the rest of the family.

Once it goes back in, the notifications stop.

And in SharePoint, there is a list of the hot & cold triggers, thankfully all from natural door opening.

Cloning Flows: Location triggers for everyone

Sometimes ideas don't work out. This is one of those times. But the reason I blog is to learn: to expand my knowledge of the Power Platform and of components outside of it. So I figured I would blog about my failure; learning is learning. Then, as I started testing the flow again, moving environments etc., it started working. I guess this is down to the location trigger being a work in progress. Moral of the story: if it was broke last month, try again this month.

Back in July, I started working on this scenario but couldn't get it working. I noticed @Flow_Joe_ & @JonJLevesque did a video walkthrough of using the Geofence trigger to send out a summary of the customer when a salesperson enters an area, which reminded me of my failure, hence this write-up. While Joe & Jon's video shows us how easy it is to create a flow, for salespeople in general I think this is a step too far. You cannot expect a salesperson to have any interest in creating flows to do this; you can expect them to click a button on a form within their D365 application.


  • The Scenario
  • Creating the Flow button
  • Cloning the Flow
  • Outcome

The Scenario

Numerous times when I have been a customer, a salesperson would come to us not knowing that we had several major cases logged with them against their product. This is mainly down to lazy salespeople (I know, they don't exist), but it would be awesome for the salesperson to get a summary of the account when they walk in the door of a customer: the number of support cases, a list of the open opportunities and orders, any complaints that have been logged. All of this information is available to the salesperson via the D365 mobile app, but it would be good to ensure that they get it and are less likely to get caught out by a customer venting at them about 5 critical bugs that have been sitting around for a month.

The Solution

Flow has a new trigger, still in preview: Location, which is fired via the Flow application when a user enters or exits an area. This is perfect for our scenario: stick a geofence around a customer's location, and when the user enters the area, the flow is triggered. Look up the customer, format an email and send it to the user.

Flow is user-friendly, a low-code solution, but you cannot expect a salesperson to create a flow for each account they want this trigger for. What can be done is to put a button on a form that automatically creates a Flow for the user against the account they have selected, which is then triggered when the user enters the location.

There are 2 separate series of flows required. The first starts with an action from the user on the account record, which triggers cloning of a template.

The second series is the clone of the template, which sends the salesperson the relevant information when they enter the customer's property.

Creating a Flow Button

Starting with a CDS “When a record is selected” trigger, configure it to be used when an account is selected.

The next step is to retrieve who is running this flow. As mentioned, this button will be published on an Account form, so it is essential to know who is running it, so an email can be sent to them. The account information and the user's identity are sent as the body to an HTTP POST trigger, which is the next flow in the chain.

An HTTP trigger is used because the next Flow requires enhanced access. An admin user needs to clone a Flow, which you would not want a normal user to be able to do. The admin is used as well to ensure any runs that happen are legitimate; the admin or system account shouldn't belong to someone walking around with Flow in their pocket.

To have the URL to send to, the next Flow needs to be created first, but here is where this button appears within the D365 interface. The first time you run it, there are a few confirmations you need to complete; finally, you can run the flow.

Cloning the Flow

This flow clones an existing template, tweaks it slightly and gets it up and running as the user.

Starting with an HTTP Trigger, I use a sample payload to build the schema.

Next is retrieving the account. As the account id is passed in from the calling Flow, a simple Get Record is used.

Next, configure the name of the Flow that will be created, making it unique for the user by adding their email address. A flow definition string is also initialised for later.

In this Flow, the user that called it from the button is needed, so it retrieves the profile using the Office 365 Users action.

Next, retrieve my template flow. Flow has several actions for the management of Flows, which are incredibly useful to a Flow administrator. The template flow is a simple flow with a location trigger and an HTTP call to trigger a secondary flow. I will discuss the detail later.

The next couple of actions try to determine if a flow with the defined FlowName already exists: first getting a list of all my flows (as an admin), then getting a list of Flows in the organisation, then filtering it with the flow name that was defined in the initial steps.
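The filtering step is just an equality check on the display name. A Python sketch of what the Filter array action does (the `properties.displayName` shape follows my understanding of the Flow management connector's output; treat it as illustrative):

```python
def find_existing(flows, flow_name):
    # Keep only flows whose display name matches the one built earlier;
    # an empty result means it is safe to clone the template.
    return [f for f in flows
            if f.get("properties", {}).get("displayName") == flow_name]
```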

If there is a flow already, just stop. If not, carry on & clone the template flow.

The Template

The Log Template is a very simple, small location trigger with an HTTP call action. The HTTP call passes in the user's location, the account id and the user who started the process. Both email and account will be swapped out as part of the clone.

The trigger region is essential for any location trigger. This one is triggered off the Microsoft campus in Redmond; someday I will be fortunate enough to go to the motherland. I chose this as it is not likely that the user would have it as a client, but it doesn't really matter where you choose, as what you need is the latitude and longitude so you can replace them when you clone the flow.

If you click on the peek code button against the trigger, it shows a JSON representation of the trigger. The latitude and longitude are those of the Microsoft office, and this is the bit I need to replace.

Cloning the Flow (part 2)

A Flow is, at heart, just a JSON file. Obviously, how it is rendered and how the hooks and actions work are the power, but the definition is a JSON file. Using this knowledge, we can create a new version of the template with a location specific to the account.

The template in all its glory is below. Using a simple find/replace, we tweak it to the specific location, account and user.

{
  "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
  "contentVersion": "",
  "parameters": {
    "$authentication": {
      "defaultValue": {},
      "type": "SecureObject"
    }
  },
  "triggers": {
    "manual": {
      "type": "Request",
      "kind": "Geofence",
      "inputs": {
        "parameters": {
          "serializedGeofence": {
            "type": "Circle",
            "latitude": 47.64343469631714,
            "longitude": -122.14205669389771,
            "radius": 35
          }
        }
      }
    }
  },
  "actions": {
    "HTTP": {
      "runAfter": {
        "Initialize_Email_Variable": [
          "Succeeded"
        ]
      },
      "type": "Http",
      "inputs": {
        "method": "POST",
        "uri": "https://prod-68.westeurope.logic.azure.com:443/workflows/<GUID>/triggers/manual/paths/invoke?api-version=2016-06-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=<SIG>-JQQvYT0",
        "body": {
          "lat": "@{triggerBody()?['currentLatitude']}",
          "long": "@{triggerBody()?['currentLongitude']}",
          "user": "@{variables('Email')}",
          "account": "@{variables('accountId')}"
        }
      }
    },
    "Initialize_Account_Variable": {
      "runAfter": {},
      "type": "InitializeVariable",
      "inputs": {
        "variables": [
          {
            "name": "accountId",
            "type": "String",
            "value": "<accountId>"
          }
        ]
      }
    },
    "Initialize_Email_Variable": {
      "runAfter": {
        "Initialize_Account_Variable": [
          "Succeeded"
        ]
      },
      "type": "InitializeVariable",
      "inputs": {
        "variables": [
          {
            "name": "Email",
            "type": "String",
            "value": "<email>"
          }
        ]
      }
    }
  },
  "outputs": {}
}

Back on the clone flow, the next step is to convert the template to a string. This makes it easier to replace the latitude, longitude etc. with the ones we want.

On the OOTB account record there are latitude and longitude fields. This data is not normally populated, but it is used by Field Service and other applications. I used Field Service to populate it using the Geo Code button.

As you can see from the above, Field Service populates both latitude and longitude to 5 decimal places. This is common precision in any mapping software such as Google, so I am not sure why the Flow trigger gives you precision to 15 decimal places for latitude and 17 for longitude.

The next 2 steps came from my attempts to get the flow to work. One of my thoughts was that the flow was expecting all 15 of the decimal places to be populated, so these steps pad out the number you have against the account with additional digits.

The expression is the same for both.
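The screenshot of the expression hasn't survived here, but the idea is easy to sketch. This Python version pads the fractional part with zeros (an assumption on my part; the real Flow expression does the equivalent with string functions):

```python
def pad_coordinate(value, decimal_places):
    """Pad a coordinate out to the requested number of decimal places,
    e.g. to match the trigger's 15-dp latitude / 17-dp longitude."""
    whole, _, frac = str(value).partition(".")
    return f"{whole}.{frac.ljust(decimal_places, '0')}"
```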


The next step replaces the newly calculated values for latitude and longitude in the JSON definition:

replace(replace( variables('flowdefstring'),'47.643434696317136',outputs('Replace_Lat')),'-122.14205669389771',outputs('Replace_Long'))

The account id is also replaced. This is used in the cloned flow to define which account the user selected. The trigger only gives you the user's current location, not the centre of the circle you configured. You could use these values to find the account, with difficulty, unless there is something I am missing. I prefer to add a variable in the clone, which is the account id.


The same applies to the email address to send to. It should be the user who triggers the geofence, but it seems to be the admin: as I clone the Flow with an admin account, then add the user as an admin, it runs under the admin account.

There is enough info now to create the flow. Using the Create Flow action, the new flow is created, up and running.

I use a JSON expression to convert the string I used for the latitude, longitude etc. find/replace, to ensure the Flow is created with JSON rather than a string.
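The round trip is: template JSON to string, find/replace, string back to JSON. A Python sketch of the same idea (the two coordinate literals are the template's Redmond values, and `<accountId>`/`<email>` are the placeholders from the template above):

```python
import json

def clone_definition(template_str, lat, long_, account_id, email):
    """Find/replace on the template string, then parse it back so the
    Create Flow step receives an object rather than a string."""
    replaced = (template_str
                .replace("47.64343469631714", lat)
                .replace("-122.14205669389771", long_)
                .replace("<accountId>", account_id)
                .replace("<email>", email))
    return json.loads(replaced)
```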


The final step is to add a Flow owner. As the sales person who triggered the flow is who it should trigger on, make them the owner, so it should run under their context.

Outcome V1

Ignore this bit if you want to avoid the author moaning about stuff that doesn’t work.

If I run the whole flow, I do generate a new Flow.

Going into what was generated, using peek code again, you can see that the Microsoft location has been replaced with the Avanade office.

The trigger is active, but this is where it stops. I cannot get this trigger to fire. Changing the location to my home, going for a walk and coming back doesn't trigger it.

If I don't put in the padding for the latitude and longitude, it doesn't trigger.

If I clone from my location, without changing the latitude and longitude, the trigger still doesn't fire.

If I configure a new trigger from scratch, that works.

Everything about the trigger looks the same when you view it in code view, but there must be something different.

This is where I started reaching out. I tweeted about it to the gods of Flow and asked in the Flow forum, where I did get a response, basically saying the same, and that the location trigger is in preview.

So, if you have got this far, how do I fix it?

Outcome V2

Like I said at the outset, this didn't work for me. Frustration set in, and I shelved the idea. But as I was putting this blog post together, re-deploying the components because my demo system had expired, it worked!

So, moving on, we need to send an email to the user with the playbook for the account. I want to list the last 5 critical cases, the last 5 open opportunities, the last 5 notes and any description the user has put in.

It is triggered by an HTTP request, the schema defined by a sample payload, which contains who triggered the workflow and which account.

Then, a great time for parallel branches. The Flow retrieves the cases, notes and opportunities in parallel.

Each branch does a similar thing. Looking at the Notes branch: first, retrieve the records with a CDS List Records action, using an OData filter and order by, returning the top 5 only.
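The three List Records settings for the Notes branch amount to something like this (the field names are illustrative of a CDS notes query, not copied from my flow):

```python
def notes_query(account_id):
    # Filter to the account, newest first, top 5 only: the same three
    # settings as the CDS List Records action.
    return {
        "filter": f"_objectid_value eq {account_id}",
        "orderby": "createdon desc",
        "top": 5,
    }
```

The cases and opportunities branches differ only in the entity and the filter (e.g. adding a priority or state condition).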

Next, put this in an HTML table, selecting the output from the Get Notes action. I select the Advanced option, then Custom columns; this way I can define the order and which columns I want to display.

The final step is to send an email.

Obviously, this can be customised to your business need, but my example lists the cases, opportunities & notes, and reminds them to fill in a contact report.


So, the user selects a button on an account form, which allows them to receive updates about one of their customers when they enter the location of the account. Easy.

I tested this with my home address and with a different user, and you can see that I get the email through. Veronica is in the US; I wasn't up at 1am writing blogs & fixing Flows.

You can also see that Flow notifies the user that it has made them an administrator on a Flow.

This Flow starts with a Flow button on a record, making it a user-initiated process. It could instead be triggered off a record creation process: if the user follows an Account, create this automation for them, as long as they have opted in.

There is location tracking in the Field Service application, but that requires the Field Service mobile app and is not suited to a salesperson. They just need to install the Flow app on their device and forget it is there.

AI Builder – Text AI

My blogging journey started with using LUIS, one of Microsoft's Cognitive Services, to automate case assignment. That blog goes into detail about how it all hung together: using a model defined in LUIS, calling the LUIS endpoint when a new case is created and classifying the case, by subject, with the result of the call.

After my summer break (sorry, but family etc. comes first), I thought I would revisit this scenario using one of Microsoft's shiny new AI Builder capabilities, the Text Classification AI model.


  • The Scenario
  • Training your Model
  • Getting Data
  • Publishing the Model
  • Using the Tags

The Scenario

In my first blog, I went through the scenario, and I don't want to repeat myself, but for the lazy who don't want to click through…

Big Energy is a supplier of energy products to end users. They have a call centre which handles any query from the customer. As a perceived leader in the sector, it is always willing to use the latest technology to allow users to interact with them, which reduces the pressure on the customer support centre.

Big Energy has a mailbox configured to accept customer emails about anything and, rather than have a group of 1st line support employees filtering and categorising the emails based on content, wants to use cognitive services to improve the process of getting the email (the generated case) to the right team.

Using AI to file the case

LUIS does a great job of this, with a BA providing sample utterances for the model and training it.

The Text Classification AI model does it slightly differently. The model expects users to provide data (in the CDS) in the form of text blocks and representative tags for the data. Both need to be in the same entity in CDS.

On a standard Case record, the classification or tag is the subject field. Subject is a parent record of Case, and the tag would be the name of the subject. As subject and case are separate entities, the Text Classification AI model will not work. A field, perhaps a calculated one, has to be introduced to enable the classification AI to work. Adding data to an entity from a parent entity breaks my Third Normal Form training (anyone remember that? Is it still a thing?).

I have raised this issue as a new idea on the PowerApps ideas forum; go there and give it a vote!

The new logic for our AI model is that the AI will classify the incoming case, adding a tag. This will trigger a flow, changing the subject of the linked case accordingly, which will trigger re-routing of the case as it did in the original LUIS method.

Training your AI

Any AI model needs training: the AI needs to separate the wheat from the chaff. Creating a model is simple in PowerApps.

Start at make.powerapps.com and select AI Builder, then Build.

There are 4 options here:

Binary Classification is useful for giving a yes/no decision on whether data meets certain criteria. The criteria can be up to 55 fields on the same entity. For example, is a lead with a low current credit limit, a high current account value, no kids but a pink toe nail (shout out to Mark Christie) likely to get approved for a new loan?

Form Processing is intended to assist users in automating scanned documents to prevent re-keying. An example would be any forms handwritten as part of a sales or service process (before you convert to a PowerApp, obviously).

Object Detection assists in the classification of items, be it types of drink, crisps or bikes, etc.

Text Classification decides on a tag for a block of text. For example, a user could enter a review of a product online, and text classification could understand which product it was for or whether it is a positive review.

All 4 of these have their origins in the Cognitive Services provided by Azure, LUIS being the big brother of Text Classification.

Ensure you are in the correct environment. Text Classification only works on data within your CDS environment, so don't expect to reach out to your on-premises SQL Server. There are ways to bring data into CDS, but they are not in scope for this discussion.

Selecting Text Classification displays a form to give you more understanding, and it is here that you name your model.

Hit Create and then Select Text. This will list all the entities in your CDS environment (in my case, a D365 demo environment).

Select the entity you want, Case for our PoC.

The interface will then list all the fields suitable for the AI model, namely anything that is a text field. I chose the description field, which typically contains the email the user sent when raising a case with the support department.

Hit the Select Field button and it will present you with a preview of the data in that table.

The next screen is for selecting your tags. These need to be in the same table which, as already discussed, is a bit of a limitation of AI Builder. Less normalised data is more common in Canvas apps or SharePoint-linked apps, but for structured data environments with relationships and normalised data this is a limitation that will hopefully be removed as Text Classification matures.

Also, option sets are not available, another common categorisation technique. Multi-select option sets would be an ideal tagging method too. Assume this will come in time.

For my PoC, I created a new field, put it on the Case form and started filling it in for a few records.

Select the separator. If your tag field contains multiple tags separated by a comma or semi-colon, this is where you configure it.

It also gives you a preview of the tags the AI builder would find using your chosen method. You can see that with the No separator option, "printer; 3d" is one tag, rather than the 2 tags displayed if semi-colon is selected. This depends on your data.
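The difference between the separator options is essentially a split. A Python illustration of the behaviour described above:

```python
def extract_tags(field_value, separator=None):
    """No separator: the whole field is a single tag.
    With a separator: each trimmed piece becomes its own tag."""
    if separator is None:
        return [field_value.strip()]
    return [t.strip() for t in field_value.split(separator) if t.strip()]
```

So "printer; 3d" stays as one tag with no separator, but becomes two tags once semi-colon is chosen.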

The next page displays a review of your data and the tags that the AI builder finds.

Next, select a language for the text field dependent on your data.

Once selected, train your model. This is where I started to run into problems. My initial population of tags was not enough, and the training quickly came back with an error. There should be a minimum of 10 texts per tag, which I didn't have; that would be hundreds of rows. How was I going to automate creating enough data to give the Text AI a suitable demo?

Getting Data

I needed thousands of records to train my model properly, thousands of records relevant to the tags I created. No online data creator seemed suitable, as none was specific enough. So how? A flow.

First, I created a column in the Contact table to store a number for each contact, a unique number so I can randomise the selection of a contact.

Next, I needed some data for the case description and the tags. This was already done, as it is the same as the utterances and intents I used for LUIS, so I exported the LUIS configuration, put the data in an Excel file and added a number to each row.

Ready for the Flow

My simple flow is described below.

Ask for the number of cases to create, then keep creating cases until you have reached that limit, using a random contact and a random description.

This flow is triggered manually, so I start with a manual trigger and also prompt for the number of cases to create.

The Subject variable is used later to define the reference for the subject we have chosen.

The default limit for loops is 60 iterations. I realised late in the day that you can change that, but breaking up loops is good practice, to limit the scope of what could go wrong, so I created a loop-within-a-loop structure for my flow.

I restrict the inner loop to 50 iterations maximum, which means the number of times the outer loop runs has to be calculated. If I want 920 cases created, my outer loop would run 18 times, each iteration creating 50 cases. I would then do a final set for the rest.
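The split between full outer loops and the final set is just integer division, which is a quick way to sanity-check the arithmetic:

```python
def plan_loops(total_cases, batch_size=50):
    # How many full inner loops of batch_size, plus the remainder
    # to create in a final set.
    return divmod(total_cases, batch_size)
```

For 920 cases that is 18 full loops of 50, plus a final set of 20.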

The next steps initialise some counters used in the loops. I also want to ensure that if the user wants to create fewer than 50 records, the outer loop doesn't run at all.

The outer loop will run for the number of loops I have calculated; this is the loop condition. The counter increments as the last action in the outer loop. The first action in my outer loop is to reset the case counter, which counts from 0 to 50. If we are in this inner loop, at least 50 cases will be created.

The first thing it does is get a random contact, using an OData filter on the number field created earlier with a random number from 0-875 (875 being the highest number in that table).

Once the contact is retrieved, find a random description/tag combination. The data from the LUIS utterances is held in an Excel file on a Teams site. Again, a rand() function takes a random number up to the maximum in that file.

Because more than one subject row could be returned, and because I don't like nesting apply to each loops inside each other, I set the subject Id variable.

Ready to create a case now.

Nothing special. It also populates the tag field.

After some testing to ensure that the case had the necessary fields entered, the flow was run for a thousand records without an issue.

Creating data this way isn't quick (20 minutes for 1000 records), but it is easy and lets you bring in reference data with little effort. Superb for PoC environments.

Training your Model (with data)

Once this data was generated, it was time to re-train my model. It ran through successfully this time.

The model is 97% sure that whatever I throw at it, it should be able to match against the tags. There is a quick test option here too, which allows entry of a sample phrase to check your model.

All ready to publish.

Publishing your Model

Publishing the model allows it to be used within Flow and PowerApps.

Clicking Publish generates a child table of the entity you first chose, where the predictions are stored. The documentation states the table will be named TC_{model_name}, but mine was created with gobbledegook.

The link on the form helpfully allows you to go straight to the entity in the new customisation interface, where you can change the label of the entity.

Also, it is useful to change some of the views, particularly the associated results view. By default it includes only name & date, which is pretty useless, so add the tag and the probability.

As this is a child table of Case, it is by default visible in the case form Related navigation item. In the classic customisation interface, you can change the label of this view.

As it is published, users can use Flow and the Predict action to predict the tag for a given section of text, useful if you want to act on the text before it reaches an environment.

Now that it is published, you need to allow the model to run. This means it runs every time there is a change to the text field. This is all done via Flow, so will use your flow runs. It stores the result in the new entity.

If a case is created now, it automatically creates the tag secondary record.

Using the tags

As AI builder generates a record for you with its prediction, and the data is in CDS, it is a simple Flow to utilise that. As it creates a record in the AI Tags table, update the corresponding case to change the subject accordingly.

Simple trigger when a record is created. The first action is to find the subject from the tag.

Update the case record with the subject and the tag so the AI can be retrained later.

That’s it. Replacing LUIS with a more user-friendly environment is definitely a tick in the box for Microsoft. The AI in PowerApps feels like a simple, user-friendly stepping stone into the AI world for a lot of businesses. Hopefully, businesses will embrace these simple models to shortcut processes, improving employee and customer experiences.

Adaptive Cards – Improved Approvals (Part 2)

Continuing the walkthrough of creating a more effective adaptive card for approvals, this part describes the flow I created to generate the card, as well as complete the action in D365 depending on the response.


  • The Scenario (Part 1)
  • Preventing progress of an Opportunity ( Part 1 )
  • Using Flow to create a basic Approval ( Part 1 )
  • Creating an Adaptive Card ( Part 1 )
  • Using Flow to create the Approval (This Part)
  • Updating the Opportunity (This Part)

Starting out

As previously described, the Flow is triggered when a user updates the Develop Proposal checkbox. In the first stages, the flow also retrieves some records that are needed later for population of the card. There are also initialisations of 2 arrays that are used to populate the approvers and product lines on the card.

The next section is used to retrieve the approvers for the territory. In part 1, a many to many relationship was added, linking User to Territory via the territory approvers table.

As the territory approvers table is a many to many relationship, it does not appear as a standard table in the common data service connector, nor the D365 connector. There are various blog posts out there which state you can just use a custom value, naming the table, but I couldn’t get it working, so I fell back to my custom connector.

In my previous post on Security roles via a PowerApp, the custom connector which allows a FetchXML string to be sent against an object is used a lot to get the teams and the roles for a user. This connector is again used to find the users associated with a territory via the new relationship. The FetchXML is below.

<fetch top='50' >
  <entity name='systemuser' >
    <attribute name='internalemailaddress' />
    <attribute name='fullname' />
    <link-entity name='cc_territory_approver' from='systemuserid' to='systemuserid' intersect='true' >
      <filter>
        <condition attribute='territoryid' operator='eq' value='@{body('Get_Account_Manager')?['_territoryid_value']}' />
      </filter>
    </link-entity>
  </entity>
</fetch>

 This will return JSON which corresponds to the users linked as approvers to the territory.

    "@odata.etag": "W/\"3421832\"",
    "internalemailaddress": "veronicaq@CRM568082.OnMicrosoft.com",
    "fullname": "Veronica Quek",
    "systemuserid": "824da0b2-6c88-e911-a83e-000d3a323d10",
    "ownerid": "824da0b2-6c88-e911-a83e-000d3a323d10"
    "@odata.etag": "W/\"1742271\"",
    "internalemailaddress": "danj@CRM568082.OnMicrosoft.com",
    "fullname": "Dan Jump",
    "systemuserid": "e3b305bf-6c88-e911-a83e-000d3a323d10",
    "ownerid": "e3b305bf-6c88-e911-a83e-000d3a323d10"
    "@odata.etag": "W/\"3422353\"",
    "internalemailaddress": "CarlC@CRM568082.onmicrosoft.com",
    "fullname": "Carl Cookson",
    "systemuserid": "113f1e3a-db90-e911-a822-000d3a34e879",
    "ownerid": "113f1e3a-db90-e911-a822-000d3a34e879"

An approval needs a list of email addresses separated by a ";". To achieve this, first put each of the returned email addresses into an array, then use the Join function to create the string used for approvers.
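The array-then-Join pattern, sketched in Python:

```python
def approver_string(records):
    """Collect each returned email address into an array, then join
    with ';' -- the same pattern as the Flow's Join function."""
    emails = [r["internalemailaddress"] for r in records]
    return ";".join(emails)
```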

Populating the Main Approval

The next part is the body of the approval that is going to be sent. I’ll link the full version of this at the end of the article, but effectively you copy your design, remembering to insert appropriate dynamic content along the way.

Here, I create the 2 URLs that are displayed in the card, which combine the starting point of the URL and append the Account or Opportunity Id.
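The URL building is just concatenation; a sketch, with the base URL and parameter names assumed from the standard D365 deep-link format:

```python
def record_url(org_url, entity, record_id):
    """Build a D365 deep link for a record, e.g. etn=opportunity.
    org_url is your instance URL (assumed parameter name)."""
    return (f"{org_url}/main.aspx?newWindow=true"
            f"&pagetype=entityrecord&etn={entity}&id={record_id}")
```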

This is displayed at the top of the card.

Further, formatting currencies is difficult in Flow (I stand to be corrected). I found this post on the Power Platform community which highlights the issue, and degvalentine has the solution, which I have tweaked to take account of null values in the fields in D365. This example is for one of the fields on the secondary grid.

if(empty(string(items('Add_to_Prod_LInes')?['manualdiscountamount'])), '0',
    concat(
        if(greater(length(first(split(string(items('Add_to_Prod_LInes')?['manualdiscountamount']), '.'))), 3),
            concat(
                substring(first(split(string(items('Add_to_Prod_LInes')?['manualdiscountamount']), '.')), 0,
                    max(0, sub(length(first(split(string(items('Add_to_Prod_LInes')?['manualdiscountamount']), '.'))), 3))),
                ',',
                substring(first(split(string(items('Add_to_Prod_LInes')?['manualdiscountamount']), '.')),
                    max(0, sub(length(first(split(string(items('Add_to_Prod_LInes')?['manualdiscountamount']), '.'))), 3)),
                    min(3, length(first(split(string(items('Add_to_Prod_LInes')?['manualdiscountamount']), '.')))))),
            first(split(string(items('Add_to_Prod_LInes')?['manualdiscountamount']), '.'))),
        '.',
        if(contains(string(items('Add_to_Prod_LInes')?['manualdiscountamount']), '.'),
            if(less(length(last(split(string(items('Add_to_Prod_LInes')?['manualdiscountamount']), '.'))), 2),
                concat(last(split(string(items('Add_to_Prod_LInes')?['manualdiscountamount']), '.')), '0'),
                last(split(string(items('Add_to_Prod_LInes')?['manualdiscountamount']), '.'))),
            '00')))
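The intent of that expression is much easier to see in Python; here is a sketch of the same null-safe logic (one thousands comma, two decimal places, values up to 999,999):

```python
def format_currency(value):
    """Null-safe currency formatting: insert a thousands comma and
    force two decimal places, as the Flow expression does."""
    if value is None or str(value) == "":
        return "0.00"
    int_part, _, dec_part = str(value).partition(".")
    if len(int_part) > 3:                       # insert the thousands comma
        int_part = int_part[:-3] + "," + int_part[-3:]
    dec_part = (dec_part + "00")[:2]            # pad or trim to two places
    return f"{int_part}.{dec_part}"
```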

Populating Product details

As the approval body is built up, the next stage is to create a table with the product lines in it. Getting the lines is a simple filter query using the primary key on Opportunity.

Like with the approvers, an array is populated with a formatted version of each line, taking fields returned and combining them with formatting rules.

The first expression deals with the fact one of the products chosen to demo had a double quote (“) in it, which messes up JSON if it isn’t escaped as it is the string delimiter. I used a simple replace expression to add a “\” before it.

replace(items('Add_to_Prod_LInes')?['productname'], '"','\"')
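The equivalent of that replace, sketched in Python:

```python
def escape_quotes(product_name):
    """Prefix embedded double quotes with a backslash so the name can
    be spliced safely into the card's JSON string."""
    return product_name.replace('"', '\\"')
```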

The next expression is the one above to format the currency with the appropriate commas and decimal places.

The output of this looping over the product lines is then combined using a join again, then appended to the main body string.

The bottom of this string starts the list of actions, which are the buttons.

The next step is to create the Approval. This is pretty simple, using a first to respond type and fleshing it out a bit, so that if a user uses the standard Flow Approval interface, they have something to relate to. No notification is needed; that would send an email to the approver, but the Flow will alert the approver via Teams instead.

My original design for this PoC was to push this notification / approval to a Team channel, one notice to the Approvers channel. As Teams integrates with D365, it did not seem much of a hop to highlight the Opportunity approval.

The only issue is that approvals don’t work in team channels, only when sent to a user. Until this is resolved by MS, you are limited to sending the approval to an individual in Teams.

Sending the Approval

The key bits of this action are ensuring you have the approver's tenant (can you post approvals across tenants?), the respond link, the approval ID and the creation time populated with the data coming from the approval. The same goes for the Reject action.

That’s it, the new approval is sent. The full JSON I have produced is here

Waiting for the Approval

As the approval is configured so that anyone can approve or reject, the next action is to wait for that approval to happen. Approvals can take up to 30 days, which is another issue, but as this is to speed up the approval process, let's not worry about that.

If the outcome is approved, then the field Complete Internal Review is checked and a note is created, linked to the Opportunity logging who approved it.

This is in a loop, as, in theory, there could be more than one approver on an approval, if you use the Approval setting that forces everyone to approve something.

The Regarding / Regarding type, highlighted above, need to be populated, otherwise you get orphan records and can spend 20 minutes wondering what is wrong (not me, obviously).

On the Reject side of the condition, the Opportunity is put back to the state it was in before the flow started, namely Develop Proposal is reset. This triggers our Flow again, but as long as the first condition is met, it won’t go any further. A note is also added, to highlight who rejected it and why.

Adaptive Cards – Improved Approvals (Part 1)

Adaptive cards are relatively new to the stack of tools available to PowerPlatform users, emerging from Message Cards. They are a great way of interacting with users who are not a typical D365 user, those on the periphery who are interested in the data but not the detail.


  • The Scenario (This Part)
  • Preventing progress of an Opportunity (This Part)
  • Using Flow to create a basic Approval (This Part)
  • Creating an Adaptive Card (This Part)
  • Using Flow to create the Approval
  • Updating the Opportunity

The Scenario

Big Energy is going well; they are now involved in some big deals for big enterprises which need a lot of time to land. The proposals that are generated are complicated, and they have struggled with some dubious sales people reducing the margins just to land the deals, which is bad for business.

An approval process needs to be implemented, where one or more of a designated group of individuals per territory review the opportunity and decide if the margins are appropriate.

Unfortunately, the approvers tend to be very busy senior directors who use D365 sporadically, if at all, and Big Energy needs to allow them to approve the opportunities wherever they are, with Outlook or Teams as the preferred option.

Tweaking the standard Sales process

Microsoft provides a Business Process Flow for Opportunity management, and in our scenario, only the approvers should be able to check the boolean Complete Internal Review. This is part of the standard Propose stage of the BPF.


To “lock” (I know it isn’t foolproof, what is?) the progress on Propose, the Complete Internal Review field is subject to a simple business rule: if the opportunity is at any stage, lock the field.

Now no one can edit that field, and if that field is made mandatory to progress the BPF stage, no one can progress the stage past Propose.

Territories are often used in Sales to group accounts or account managers, and in our scenario there is a set list of approvers for a territory. I have added a new many-to-many relationship for this, Approvers, and ensured it is listed in the user form as one of the relationships.

Using Flow to create an Approval

In the standard Propose stage, there is another boolean that is of interest, Develop Proposal. The Flow is triggered when this value is changed. A simple CDS update trigger is the starting point.

The next stage is to confirm that this trigger is coming with the correct record state: the record has been marked with Develop Proposal, but the other field, Complete Internal Review, is still empty.

The flow to create the adaptive card is fairly intense (well, from my experience, as you will see), so for now, create an Approval using enough details to get the default experience that can be built on.

In Details, there is a lot you can do using markdown, but this is not as comprehensive as the formatting you get from adaptive cards.

When this flow is run, the assignee will get an email with a simple, standard approval, which is in itself an adaptive card, but it is fairly plain.

Using the Flow history, this action also shows the adaptive card that was built.

Copying this value into the Adaptive card designer JSON section gives the format for a basic design, which can be augmented to show some proper information.

Building an Adaptive card

Adaptive Cards are a means to interact with your users via email, teams or any other app that handles the rendering of them. They have actions, allow images to be presented and can format text in a markup that imitates a comprehensive website. They are supported in Outlook mobile apps as well as O365, either using the main app or online.

They work by rendering a JSON object, which can be formatted to match the host application (the dark black Teams theme, for example, renders it very differently, but the core actions are still there).

Microsoft has built a superb tool to manage Adaptive cards, the new version, at adaptivecards.io/designer. This site has lots of examples to get you started, the Expense Report is a good starting point from a design point of view, but the standard approval card forms the base for the card. There are bits in it that you need to incorporate into your card to allow the approval to work.

The parts in the data section are the essential bits that, in our adopted JSON, need to be duplicated or populated by Flow to allow our card to act as an approval.

My card is a bit different than the standard, displaying key parts of the Opportunity and the associated product lines.

As you can see, there is a lot more information on what is happening on the opportunity, probably enough for a sales manager to make a decision in most cases. Included in the card are links to the Account and Opportunity if further review is needed.

I would recommend starting from a sample and building your content, with dummy data, so you can get the layout correct.

Each of the buttons is also a card of its own, allowing a comment to be made before the approval is approved or rejected.

These have been copied from the standard adaptive card produced by the Flow approval so that the submitted approval works like a standard approval.

Some considerations and limitations

I first started by trying to reproduce the Expense Approval card in full from the samples.

This has a great use of hidden / visible sections of the expense lines which could give you a lot of real estate for Opportunity lines. Unfortunately, these are not rendered in Teams.

Also, I thought I would be able to use an HTTP trigger, but again, any button with an HTTP trigger is ignored in Teams; you are only allowed to create actions for opening URLs, submitting, hiding parts and showing a secondary card.

Below the main part of the designer is the JSON, which is created by any changes you make above but also can be edited and reflected in the visualiser. The snippet below is taken from the standard card, which contains all the bits that need duplicating to ensure the new, improved approval works correctly.

   "actions": [
            "type": "Action.ShowCard",
            "title": "Approve",
            "card": {
                "type": "AdaptiveCard",
                "body": [
                        "type": "TextBlock",
                        "text": "Comments",
                        "wrap": true
                        "type": "Input.Text",
                        "id": "comments",
                        "placeholder": "Enter comments",
                        "maxLength": 1000,
                        "isMultiline": true
                "actions": [
                        "type": "Action.Submit",
                        "title": "Submit",
                        "data": {
                            "Environment": "Default-2821cf92-86ad-4c7b-ba9a-5c79a70d4a21",
                            "ApprovalTitle": "Appoval required for Opportunity",
                            "ApprovalLink": "https://flow.microsoft.com/manage/environments/Default-2821cf92-86ad-4c7b-ba9a-5c79a70d4a21/approvals/received/6cce94f6-603c-40e7-adb6-8b20c75f724f",
                            "ApprovalName": "6cce94f6-603c-40e7-adb6-8b20c75f724f",
                            "ItemLink": "https://.crm.dynamics.com/main.aspx?newWindow=true&pagetype=entityrecord&etn=opportunity&id=b7c47c42-a290-e611-80e3-c4346bacba3c",
                            "ItemLinkDescription": "Opportunity for  7-Eleven and Udaside label - ",
                            "OnBehalfOfNotice": "Requested by Carl Cookson <CarlC@CRM.onmicrosoft.com>",
                            "CreatorName": "Carl Cookson",
                            "CreatorEmail": "CarlC@CRM.onmicrosoft.com",
                            "CreationTime": "\"2019-07-03T14:30:02Z\"",
                            "MessageTitle": "Appoval required for Opportunity",
                            "Options": [
                            "SelectedOption": "Approve",
                            "ActionType": 1
                "$schema": "http://adaptivecards.io/schemas/adaptive-card.json"

User Admin PowerApp (Part 1)

Sorry it has been a while since my last blog post; this scenario has taken a while to get to a state where I was happy to show it off. Mainly due to my own lack of understanding of the intricacies of the D365 API, but I have also been busy outside the blog, you know, real life.


  • The Scenario (This part)
  • Notifying the manager of a new Employee (This Part)
  • PowerApp to display and update User Data
  • Update Roles and Teams

The Scenario

Big Energy Co is going from strength to strength, presumably because of the innovative solutions using LUIS , Alexa and IFTTT.

The HR department is ramping up recruitment and new teams are being shaped to support all the growth.

One of the criticisms from the managers is that it takes a while for the IT / D365 administrators to get users in the correct teams and security roles so they can be effective in D365.

A clever chap in the management team suggested that they be given an app that would allow a manager to update the roles and teams (and other relevant parts of a user) without resorting to logging into D365 administration. Something they can use wherever they have WiFi or a data connection.

It would also be good to get a notification when they have a new employee, or someone is added to their reports.

The Flow

This flow is quite simple: trigger an email when a user has the Manager (parentsystemuserid) field updated. O365 will create the user for us (assuming you are in the cloud) and an administrator will still have to update the user's manager.

Here, the attribute I am interested on is parentsystemuserid.

Next, just check that the manager is actually populated. In a lot of businesses, removing the manager is part of the process of off-boarding an employee, to tidy up selection lists etc.

Then, get the manager's user record from D365 so that the email can be sent to them.

Told you it was simple. I am sure that this can have more logic – Do we need an approval step before assigning this user? Do we have to wait for HR to do some work and only activate the user once all the checks are done?

Next, I’ll step through the PowerApps set up to retrieve data from my reports.

Creating Attachments in D365

This isn’t one of my usual posts, as I always like to start with a business case. I intend to create a blog about the business case / solution that created this issue once all the other pieces fall into place, so watch out for that (keep them keen by teasing, they say).

The Problem

You might not know there is a problem, but there is. Squirrels are a problem, but not my field of expertise, @dynamiccrmcat will have other opinions.

Attachments are a problem. More specifically, trying to add an attachment to D365 via a Flow or PowerApp is a problem. Yes I know we should be using SharePoint or Teams etc, but sometimes you want to keep your data in D365.

You should just be able to “Patch” the Notes entity in a PowerApp with the file you want in the documentbody field of the entity, but you run into problems (bug) with objecttypecodes (it is expecting a GUID and all you know is an entity name).

@JukkaN pointed me to this article on the PowerApp forum which explains the problem in more detail, and more clearly, than I have. This was via a conversation with him and @TattooedCRMGuy where we came to the conclusion that there is a bug in the CDS connector.

So my next thought was to call a Flow from the PowerApp, passing in the file as a parameter. This doesn’t work either; any which way you try, it ends up with the file being passed as a link to the Azure blob the PowerApp is temporarily using to store the attachment.

Then I started using the developer's friend, Google. My first thought was that if I could pass the Flow a file rather than the URL, this would work. This thought led me to an excellent YouTube video by Paul Culmsee. He shows how to pass a photo from PowerApps to Flow to save it to SharePoint. He had the same issue, how to pass a file to Flow, and thanks to him, I have duplicated it for saving a file to D365.

So the logic I have deployed is

PowerApps → Custom Connector → Flow → Custom Connector into D365

With 2 custom connectors here, this is a relatively expensive solution, as users require a PowerApps Plan 1 license, which might be an add-on to their existing license.

The D365 Connector

Starting at the final step, I utilised a custom connector like in my previous posts to interact with the D365 API directly. I am not going to go through the method of creating the connector, just this custom action. Again, PostMan is your friend. Using this tool, I generated a JSON template to post against the Annotations entity. Firstly, add an action to the connector. 

Next, select Import from sample, and populate as below, obviously using your own instance API endpoint.

Couple of things here: Annotation is the name of the table that Notes are stored in. “objectid_contact@odata.bind” is the single-valued navigation property for linking the note to a Contact. In the CDS connector, this field is not visible; you are expected to enter a pair of _objectid_value and objecttypecode, which doesn’t work, hence this blog post (I hope they fix it, but not too soon now I have worked it out). Finally, documentbody is the field where the attachment is stored, as a Base64 string. This is weirdly different from the binary data store used by SharePoint.
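The Base64 point is worth a sketch; this is all documentbody expects:

```python
import base64

def to_documentbody(file_bytes):
    """documentbody wants the attachment as a Base64 string, unlike
    the raw binary SharePoint stores."""
    return base64.b64encode(file_bytes).decode("ascii")
```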

Select Import and the first connector is done.

The Flow

The flow is pretty simple: a HTTP trigger, messing around with the inputs and sending the information to the D365 connector. Simple, but I needed the video by Paul Culmsee to guide me through. The premise is that, rather than the usual approach of looking at the body of the web call, we take parts from the query string, and the content of the call is the file itself. He does a much better job at explaining it than I do, so head over to the video to actually learn.

A standard HTTP trigger. I then use a Compose data operation to take data from the trigger using a formula based on the query parameters passed.


This states that I want to find and store the filename parameter passed in the URL when the trigger was fired.
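The same extraction, sketched in Python with a hypothetical trigger URL (in Flow this is triggerOutputs()['queries']['filename']):

```python
from urllib.parse import urlparse, parse_qs

def filename_from_trigger(url):
    """Pull the 'filename' query parameter out of the trigger URL."""
    return parse_qs(urlparse(url).query)["filename"][0]
```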

The Get File Body does the same but looks at the content of the body that was passed in.


The final part is a call to the D365 custom connector, passing in the compose operation outputs

Another Custom Connector

You can’t call a webservice-triggered flow from a PowerApp directly. You can call a Flow, but the flow doesn’t pass the appropriate parameters; it only deals with strings, and you can’t convert your file to a string in PowerApps. I will obviously be proved wrong here, but I will learn. That’s one of the reasons I blog.

Hence, why you need to create a custom connector to pass from PowerApps to the Flow above.

Connectors can be created by uploading a swagger definition. Swagger is an open-source framework to document APIs and has been recently converted to the OpenAPI specification. Obviously connectors can be built from scratch, but because of the file upload that is required, a definition of the API is required.

Paul again comes to my rescue; he has a great blog post in which he goes through in detail the file that was produced to support his video. In the YouTube post, he uses a tool that I cannot find, but this file walkthrough was enough for me to produce my own file. I am not going to go through the detail here, Paul does a much better job.

In custom connectors, hit the +, then select Import an OpenAPI file.

Give your new connector a name and select your newly created file definition. If your file is correct, you are now presented with a pre-populated definition of your connector. The query parameters displayed include lots of configuration items that should be pre-populated from your Swagger file.

You can see the connector is not expecting a Body.

If you test the connector action, there is also an extra parameter it is expecting.

This File parameter is essential. Now it’s ready for use!

The PowerApp

To demonstrate the connector, create a new PowerApp. Because this is for the contact entity, use a drop down to get a list of Contacts to attach the note to from D365.

Associate this with D365 using the Common Data Service connector.

Select the Contacts Entity, back in the properties of the control, use Full Name as the Value. Next, add a “Add Picture” control.

Also add a datatable. This time connect to the Annotation (Notes) entity in D365. Make sure you use the D365 connector though. For some reason, the Common Data Service connector does not return the Regarding as a field you can use. Select a few relevant fields, Title, Note, File Name and Document.

In the Values field, filter the datatable by the Contact that is selected in the dropdown and only show those where there is an attachment.

Filter(Notes, Regarding = GUID(ContactDD.Selected.Contact) && !IsBlank(Document))

Selecting a contact now should deliver a list of notes that are attached to that record.

To upload a document, add a button. This will trigger the custom connector, so it needs to be added to the PowerApp. Select Data sources, then Add data source.

The custom connector that was created earlier should appear. Select it.

In the button's OnSelect action, if all is well, enter the name of the connector and action, and it should give you a list of the parameters you need.

The final call to the connector looks like this

    "Added from PowerApps",
    "Added from PowerApps",

The filename comes from the control within the Image upload control, the contact Id is from the selected contact, some text to fill out the title and note (you could add a text control to take that input, obviously) and then the image.

I refresh the notes data set after I am done so that the list shows the new data.

So that’s it, a complicated solution to a “bug/feature” currently in PowerApps.

A screenshot of that squirrel note against the contact in D365, just so you know I am not bluffing.

BTW, squirrels are evil, rats with marketing, don’t believe everything @dynamiccrmcat says, though it is probably just squirrels she is wrong about.

IFTTT / Flow / D365

This will be a quick post, as the connector doesn’t take a lot of configuration.

The Business Scenario

Big Energy has a lot of sales people, all over the country, who make a lot of calls, usually on their mobiles, to potential or current customers, arranging opportunities and resolving any issues.

The Sales Director at Big Energy is concerned that the sales people don’t always log their calls, only doing so when they deem it important, so there is no traceability of how many calls an individual has made.

Could a solution be found that would log every call a sales person makes to a number D365 knows about?


IFTTT (IF This Then That) is a free web service that allows users to create applets to connect their devices with their services. There are lots of sample applets to automate tasks, my favourite is linking Alexa’s shopping list with Tesco to add everything I add to Alexa to my Tesco order. Simples.

IFTTT works with a lot of devices and it has a stand alone app for smart phones, allowing interaction with the device.

Getting Started

Log in to IFTTT and go to My Applets, then New Applet. Hit the big blue this

Search for phone, and select Android Phone Call (sorry, this is for Android users only).

This presents you with several triggers, the IF part. Select Any phone call placed. You will have to make another applet for received calls, but the logic is the same.

The next step is to tell IFTTT what you want to do, select the big That button

IFTTT lets you search for all the available services that you can trigger from your data. Search for Web and select Webhooks.

The only option here is to make a web request.

If you have read my previous articles (if not go here or here now and shame on you), creating a Flow trigger from a web request should be straight forward.

Create the Flow

As in the previous articles, start with a generic HTTP trigger in Flow: select the HTTP trigger and enter a default body for the JSON. This is generic enough to allow the call, so the schema passed from IFTTT can be created later.

As the action, send yourself an email, with the Body of the email being the body of the request.

Hit save and go back to the trigger. This URL is the bit you need to pop back into IFTTT.

Connecting Flow and IFTTT

Back in the IFTTT Web Request, paste in the URL. I have changed the Method to POST, the content type to application/json and included 3 ingredients (data returned by the call placed trigger) in the Body, formatted appropriately to become a valid JSON request.

This is now ready for testing.


For testing to commence, a call needs to be placed. This means you need to install the IFTTT app on your phone, log in and check the applet you created is available.

Once this is done, make a call.

If successful, your Flow should run, sending you an email. If not, you can look in the IFTTT activity log.

The log highlights what happened for each run, when it was updated etc. Not as user friendly as Flow, but at least you get an error code. I found I was getting 400 errors, as the format for the JSON wasn’t correct.

My email also contains the information sent from the request; this is used to tell Flow properly what is expected, allowing access to the properties.

Is the Contact known?

The assumption is that not all calls made by the user will be to known contacts, either unknown to D365 or personal calls. The first thing required is to find the contact. Using the List Records component, use a filter query to return all contacts where any of the phone numbers held against the contact match the data coming from IFTTT. Be careful here, as the number used is the one the user entered against the contact on their phone, or as dialled.
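A sketch of the filter query (telephone1, telephone2 and mobilephone are standard contact phone fields; adjust for whichever fields you actually hold numbers in):

```python
def contact_phone_filter(number):
    """Build an OData filter matching the incoming number against the
    common phone fields held on contact."""
    return (f"telephone1 eq '{number}' or telephone2 eq '{number}'"
            f" or mobilephone eq '{number}'")
```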

The next step is to check if any contacts were found. As previously, check the length of the return from the previous step has one or more contacts in it. Add the contact id to a variable if contacts were returned, terminate with grace if not.

Create the Phone Call

Creating the phone call is a straight forward call to the CDS Create Record action.

Couple of formulas here, firstly the timestamp that comes from IFTTT looks like

May 01, 2019 at 05:50PM

Flow doesn’t like this, so the expression for Due is

replace(triggerBody()?['occuredat'], ' at ',' ')

Duration from IFTTT is in seconds; in D365 it is minutes. A simple divide by 60 puts in the right value.
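Both fixes together, sketched in Python (the IFTTT timestamp format is the one shown above):

```python
from datetime import datetime

def call_fields(occurred_at, duration_seconds):
    """Strip the ' at ' IFTTT inserts so the timestamp parses, and
    convert the duration from seconds to the minutes D365 expects."""
    due = datetime.strptime(occurred_at.replace(" at ", " "),
                            "%B %d, %Y %I:%M%p")
    return due, duration_seconds // 60
```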


Flow is complete.

Can it be recorded better?

The call that Flow creates is missing 2 key components: the From and To.

The CDS connector doesn’t support activity parties (these fields are both party fields, the user can type and search for multiple contacts, users, leads etc to populate this normally). You can get at these via standard API calls, so back to the custom connector.

My previous post on LUIS used a custom connector to close the incident using an action. This walks through creating the connector, so I won’t repeat myself.

I will step through the specific action for creating the call, as it isn’t straightforward. Further, Postman is still your friend; the only way I managed to get this configured was by relying on this great tool.

Using Postman, the format of the JSON can be defined, based on the API reference for Phonecalls. This highlights that activity parties are created and associated with the call in the same request that creates the phone call.

    {
        "subject": "CC Test",
        "scheduledend": "2019-05-01 12:00",
        "regardingobjectid_contact@odata.bind": "/contacts(A651968A-5660-E911-A973-000D3A3ACAF8)",
        "directioncode": true,
        "scheduleddurationminutes": 6,
        "phonenumber": "test",
        "phonecall_activity_parties": [
            {
                "partyid_contact@odata.bind": "/contacts(A651968A-5660-E911-A973-000D3A3ACAF8)",
                "participationtypemask": 2
            },
            {
                "partyid_systemuser@odata.bind": "/systemusers(C3AE1146-AD6D-E911-A984-000D3A3AC0C2)",
                "participationtypemask": 1
            }
        ]
    }

Take this JSON body and copy it into a new action in your custom connector.

Enter details in the general page, just enough to uniquely identify your action. Of course, you need to be a bit more descriptive if you plan on shipping out the connector.

Select Import from sample, select POST, the URL should be just PhoneCalls, and paste the JSON above into the body.

Flow does some magic and now you can update your connector. I tried testing this, but got a little confused (me of little brain) so I assumed that Flow is good and would handle it and went straight to using the connector in Flow.

Back in Flow, select a new action, choose your custom connector (the one you just created) and the action you established.

Then populate as below. I know the parameter names are not the best; someone with a bit more time would have tidied these up.

These are the same formulas that were used earlier. You need to add the “/contacts(” prefix etc. to each GUID, as the @odata.bind format requires it.
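A sketch of the prefixing expression, assuming the contact id was stored in a variable named ContactId earlier in the Flow:

    concat('/contacts(', variables('ContactId'), ')')
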

Run the Flow and just like that, the Phone Call is created in D365 with a Call To correctly established.

Call From is missing; IFTTT doesn’t let the Web call know who called it. I have searched, but IFTTT is meant for home-grown activity, so if you have registered the app, you should know the caller. The easiest way around this is to pass in a hardcoded value (an email address) specific to the end user, so a System User can be looked up and populated in the parties field.

Alexa, Field Service and Me (Part 3) – Creating Work Orders

This is a continuation of my series on a proof of concept to allow Alexa to interact with D365 Field Service


  • The Business scenario (Part 1)
  • Create an Alexa Skill (Part 1)
  • Connect the Skill to D365 (Part 2)
  • Use Field Service to book an appointment (This Part)
  • Return the information to Alexa (Part 2)

In this final (unless I need to expand on the scenarios) part of the story, Flow will be used to take the information garnered from the user and create a work order.

Calling a Sub-flow

The original flow was all about Alexa, but what about other voice assistants? Flow is no different from any other programming language, and as such we can use the same concepts when writing decent flows, one of them being re-usability. If the code to create the Work Order and book it is generic, it can be re-used when Google Assistant or Siri is connected to the application.

To this end, the Flow starts with another HTTP trigger, which is called from the previous flow.

Just like when connecting Flow to Alexa, the URL is required in the child to update the caller. Create a new Flow, using the When a HTTP request is received trigger. As content for the JSON Schema, add just enough to get you going and use the output to send an email so that the schema coming from the parent flow can be seen. This is a repeat of the logic used to start our parent flow.

Once saved, the URL will be generated in the trigger step. Use this to call the child Flow from the parent, via the HTTP action. The Method is a POST; the URI is the trigger URL defined in the child Flow. No headers are required. In the body, pass in the things that are already known, to avoid repeat calls to D365. The Intent is also passed in, which has all the details about what the user wanted.
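A sketch of the body for the HTTP action, assuming the contact id and email were stored in variables in the parent Flow (the property names here are my own):

    {
      "contactid": "@{variables('ContactId')}",
      "email": "@{variables('Email')}",
      "intent": @{triggerBody()?['request']?['intent']}
    }

The intent is passed through as a whole object, so the child Flow can pick out the slots itself.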

Now ready for testing, ensure both Flows are in Test mode and trigger Alexa. After a little delay, an email should be sent in the child Flow detailing enough information to create a Work Item.

Use this email to populate the JSON as previously, easily creating the schema that is required.

Creating a Work Order

Next, get the Account the contact is associated with. This is done with a call to the D365 Instance using the Common Data Service Connector.

A Work Order needs some fields to be set before it can be saved, Work Order Type being one of them. This could be hard coded, but Alexa has supplied the intent. To match the Work Order Type with the intention in Alexa, a field was added to the Work Order Type entity, Alexa Intent, which is searched for in the CDS List Records action.

To make the Flow easier to manage and reduce the dual looping, the Work Order Type is stored in a variable.

Once the data for the new Work Order is available, create the Work Order using the CDS Create Record connector.

Most of these are obvious, but the one that is not is the Work Order Number. In Field Service, this is automated, with a prefix and a number range. As this doesn’t work in Flow, a work order number is generated using a random number, via an expression.
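A sketch of such an expression; the prefix and number range are assumptions, and a random number does not guarantee uniqueness the way the Field Service auto-numbering does:

    concat('WO-', rand(100000, 999999))
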


A few other parts of the work order are populated, helping the service manager to match the time as appropriate.

Alexa, Field Service and Me (Part 2) – Linking to Flow

This is a continuation of my series on a proof of concept to allow Alexa to interact with D365 Field Service


  • The Business scenario (Part 1)
  • Create an Alexa Skill (Part 1)
  • Connect the Skill to D365 (This part)
  • Use Field Service to book an appointment (Part 3)
  • Return the information to Alexa (This part)

In this post I will be linking Alexa to D365 and returning some information back to the end user.

Receiving information from Alexa

Alexa interacts with the outside world with a HTTPS request. Once Alexa has determined that it understands the user and they have asked for something of your skill, it posts to the web service you have configured.

That API Guy has a great post where he links Alexa to his O365 email account and has Alexa read out his new email. This article steps through linking Alexa and Flow, and it showed me how simple that part of the integration is. Microsoft has done the hard work by certifying its connections; you just need to create one.

Flow provides several methods of subscribing to HTTP events, but the one we are interested in is at the bottom

The URL is generated for you, and is the bit you need to paste into the Alexa Skill configuration once we have created the Flow. For the JSON schema, we just want a stub for now. The schema is defined by Alexa and how you configure the skill, so it is best to get the connection up and running and use the initial call to substitute the real schema later.
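A minimal stub schema is enough to let the Flow save:

    {
        "type": "object",
        "properties": {}
    }
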

Every Flow needs at least one action to be able to save, so to aid testing and understand the JSON sent by Alexa, the first step is to send an email to myself with the content of the call.

Save the flow and go back to the trigger.

A specific trigger URL for our Flow is now generated.

Back in Alexa

In Alexa, on the left in the Skill is Endpoints. Alexa allows different endpoints to the skill depending on the Alexa region as well as a fall back. As this is a POC, using the default is appropriate.

The important part of this is the drop down below the URL copied from Flow, this needs to be the second option “My development endpoint is a sub-domain of a domain that has a wild card certificate from a certificate authority“. Basically, Microsoft has certified all their Flow endpoints with wild card certificates, allowing Amazon to trust that it is genuine.

Once saved, build your skill again. I found that every time I touched any configuration, or indeed anything in Alexa, I needed to rebuild.


Ready to test. Alexa allows you to test your connection via the Test tab within your skill.

You have to select Development from the drop down. You can also use this interface to check Production skills.

A word of warning: every time you build, to test that new build I found I had to re-select Development in this window by toggling between “Off” and “Development”.

Back in Flow, test your flow by confirming that you will conduct the action.

In the Test panel in Alexa, enter a phrase with your Invocation and Utterance in Alexa entry and if Alexa understands it is for your invocation, your flow should be triggered! Alexa will complain, as our simple flow hasn’t returned anything to it. We’ll worry about that later.

In our simple Flow, I used email as the action, as can be seen below.

This is the raw JSON output of the Alexa skill and this is used to tell Flow what to expect. This way it will provide the rest of the Flow with properties that are more appropriate.

Back in the Flow trigger, select Use sample payload to generate schema, which presents the text entry, where you paste in the email body.

Flow does the hard work now and presents you with a schema for what Alexa is sending your Flow

Authenticating the user

Taking triggers from Alexa, and hence a user, is all well and good, but they expect a rapid response. Alexa Skills are in the public domain once published; anyone can enable your skill, hence a little work needs to be done to understand who is calling our skill and whether Big Energy Co supports them.

Which User is it?

The request from Alexa does include a unique reference of the user, but normally businesses like Big Energy Co work on email addresses, which you can get from Alexa, but the user has to give you permission. This can be pre-approved when the end user installs the skill or you can ask for approval.

Alexa needs to be told that your skill would like this permission. This allows Alexa to request the permission for you when your skill is enabled by the user.

On the left hand menu, there is a Permissions link, where our skill asks for the email address

Asking Alexa for a user’s email address is a separate web call, using properties from the original message sent by Alexa in the trigger.

Alexa provides each session with an access token so that Flow can ask Alexa for more information in the context of the user’s session: location, names, email etc. As this is a globally distributed system, the API endpoint can vary depending on where the user started their session. The URI at the end asks for the email address of the user of the session.
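In the HTTP action, both the URI and the Authorization header are built from the trigger body; per the Alexa Customer Profile API, the email request looks like this:

    GET @{triggerBody()?['context']?['System']?['apiEndpoint']}/v2/accounts/~current/settings/Profile.email
    Authorization: Bearer @{triggerBody()?['context']?['System']?['apiAccessToken']}
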

Alexa may refuse to return the email address because the end user has not given permission for our skill to share this information. If the call to get the email was a success, it means that Alexa has returned the value and the Flow can continue.

Without the permission, the call fails, so the Flow captures this. Remember to set a “run after” on this step to ensure it still runs on failure; otherwise the flow will fail not so gracefully.

Alexa provides standard permission request functionality: the Flow responds to the original call with this detail in a JSON formatted string, listing the permissions the Skill wants. Finally, for this failure path, let the Flow terminate successfully, as it can’t continue without the information.
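The response body that asks for consent follows Alexa’s standard format, an AskForPermissionsConsent card listing the permission scope; the spoken text here is illustrative:

    {
        "version": "1.0",
        "response": {
            "outputSpeech": {
                "type": "PlainText",
                "text": "Please grant this skill permission to read your email address in the Alexa app."
            },
            "card": {
                "type": "AskForPermissionsConsent",
                "permissions": [ "alexa::profile:email:read" ]
            },
            "shouldEndSession": true
        }
    }
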

Does Big Energy Co support the user?

It is not enough that the user has asked us for help; they need to be known to Big Energy Co as a contact. Querying D365 for this data is the next step.

Using the D365 connector, query the contact entity for the email address that Alexa has given us.

A conditional step checks whether the return from the Retrieve Contact step has a single record, using the expression below.
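A sketch of that expression, assuming the query step is named Retrieve Contact:

    equals(length(body('Retrieve_Contact')?['value']), 1)
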


Responding to the user, quickly

Like any Alexa Skill, the user expects an immediate response. Flow in itself is quick, but when it comes to updating or creating records, it can take seconds. The timeout for your response to Alexa is 10 seconds, which isn’t a lot when you want to look up several things to create the appropriate work order.

To get around this, respond to the user once you know you have all the information you need to create the appropriate records in D365. Here, the Flow responds if the contact is in our database. In production, you could do some more checks, or just assume that if the contact is known, the logic could create an opportunity to sell them a contract if nothing else. Equally, terminate gracefully after responding that Big Energy Co doesn’t know the customer, prompting them to ring the service desk.

In the switch statement, a response to the user is built up. Specific, personalised responses using the data you have retrieved are essential to give the customer confidence that you have received the request and will respond.

The responses are short and to the point but personalised to the customer and what they asked for with some expressions to add more information if they told Alexa.

This snippet adds, after “request for a Service”, what the user asked for a service on, relying on the JSON formatted values.

if(equals(triggerBody()?['request']?['intent']?['slots']?['device']?['value'], ''), '', concat(' for your ',triggerBody()?['request']?['intent']?['slots']?['device']?['value']))

This snippet adds to the end a sentence including the date that the user asked for.

if(equals(triggerBody()?['request']?['intent']?['slots']?['date']?['value'],''),'',concat( 'We will endeavour to send an engineer on ',triggerBody()?['request']?['intent']?['slots']?['date']?['value']))

Finally, for the Alexa response part, the Flow returns a response to the user. This combines the body with a little standard text. This is what Alexa will say. You can also add a response to the screen for those devices able to do that.
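The Response action returns a body in Alexa’s standard response format; the spoken text here is illustrative:

    {
        "version": "1.0",
        "response": {
            "outputSpeech": {
                "type": "PlainText",
                "text": "Thanks, your request for a Service has been logged. We will endeavour to send an engineer on Friday."
            },
            "shouldEndSession": true
        }
    }
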

The final part of this flow goes on and creates the work order. I separated out the flows using a sub-flow, which is discussed in the next part of the blog.