Incident App (Part 2)

In the first part of this series (here), I introduced an application I wrote to improve incident management at the club I am proud to chair.

Requirements

  • Centralised, secure list of incidents – (Part 1)
  • Ability to add divers to list for trials or competitions (Part 1)
  • Data entry must be easy and not time-consuming (Part 1)
  • Weekly notification of all incidents to the welfare officer
  • Email to parent or guardian of the diver when an incident is raised
  • Escalation to welfare officer and others in the organisation for serious incidents
  • Not cost anything

In this part I will walk you through the flows I used to notify parents and the welfare officer. There is also a requirement to escalate and notify immediately for serious incidents.

Weekly Notification

To allow our welfare officer to monitor all incidents and any patterns, it was requested that they receive a weekly overview of all incidents raised that week.

To do this, I created a flow using the recurrence trigger.

Using these options, the flow will trigger at 3 am every Sunday morning.

I then establish a variable for the email body and retrieve all the items that have been created since last week. The filter query is used here to return only incidents created in the last 7 days, with a simple addDays function used to get the date 7 days ago.
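
As a rough sketch, assuming the built-in Created column is used, the filter query in the Get items action looks something like this:

Created ge '@{addDays(utcNow(), -7)}'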

Next, just check to see if there are any incidents this week by checking the length of the returned list.

If this is greater than 0, incidents have been created and the flow goes on to create an email; if not, it does nothing.
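
As a sketch (the Get_items name is an assumption for whatever your SharePoint action is called), the condition compares this expression against 0:

length(body('Get_items')?['value'])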

The next part is where I had to be a little bit clever, as the standard approach did not give me enough flexibility. You could pass this list into a Create HTML table action, but that puts the data in a column format, whereas I wanted to display it as a list going down the page, more of a normal email format.

Instead, I built up the email, formatting the content as I went, starting with an Apply to each over the results of the SharePoint Get items action. The first action, which I used repeatedly while creating the email, strips the HTML content from the HTML-aware columns in the SharePoint list. Leaving it in resulted in a lot of mangled emails.

I then append formatted HTML content to the Email Body string I created earlier.

The incident date is a simple formatDateTime function usage.

formatDateTime(items('Apply_to_each')?['IncidentDate'],'dd/MM/yyyy')

The next step is needed because the Contacted? field is a multi-choice option set. Again, I loop through the contents of the field and append each value to the email body variable.

The final parts of the loop finish off the content for each incident.

Finally, send out the email. The subject details how many incidents occurred in the week, with a count taken from the SharePoint items returned.
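
As a sketch, the subject can reuse the same length expression, along these lines (the wording is illustrative):

Weekly incident summary: @{length(body('Get_items')?['value'])} incident(s)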

Email the Parent / Guardian

The next requirement was to notify the diver’s parent or guardian. For years, we had relied on slips of paper, which again is a GDPR nightmare. Using email is a sure-fire way to ensure we have done our duty. This only applies to minor incidents; parents will certainly be involved a lot quicker for anything serious.

Firstly, trigger when a new item is created in the SharePoint list

Next, get the diver that is indicated in the list, the diver id being the linking field between the incident list & the members list.
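
The filter query for this Get items call is a simple equality on the linking column, roughly like the following (the column and trigger property names are placeholders for whatever yours are called):

ID eq @{triggerOutputs()?['body/DiverId']}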

As this SharePoint Get items call may return more than one item, we have to place the next steps in an Apply to each.

Check to see if the diver has an email address (if new divers come on board, this might not be the case until we have their full details).

If there is no email address, stop; otherwise, send an email to that address to notify the parent.

Escalate an Incident

Most incidents, thankfully, require little follow up or after care. But, there are occasions that require our parent association to be notified along with insurance companies. We also have an internal follow up process for such incidents by our welfare officer.

Rather than create a new flow, I continue after the steps above. Does the Contacted? multi-select option set contain Ambulance or Police? Either means an immediate escalation. This is done by looping through the selected options and updating a variable if one of them matches either of the two conditions.
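
Inside that loop, the check is just an or() over the current option's value, roughly like this (the Apply_to_each name is an assumption for your loop's name):

or(equals(items('Apply_to_each')?['Value'], 'Ambulance'), equals(items('Apply_to_each')?['Value'], 'Police'))

If this evaluates to true, the escalation variable is set to true.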

Further, there is a severe boolean on the form, so if that is triggered, also update the variable to true.

Finally, if Severe has been set, send an email

Alexa, Field Service and Me (Part 5) – Using Azure Service Bus

In my previous post I walked through swapping out Power Automate with Azure Functions for responding to Alexa in our Field Service scenario.

This post is about finalising the code by using a Service Bus-triggered function to create the Work Order.

In the Power Automate version I used a child flow call to allow the response to happen quickly, without waiting for the numerous calls to establish data then creating the record. This is standard, good practice to improve the response time.

Create the Queue

Firstly, head over to portal.azure.com & create a Service Bus namespace. The Service Bus is the messaging underpinning for our queue.

Next, create a queue within the Service Bus. A queue is a list of things that your code needs to process. Functions can then subscribe to the queue and be triggered when something enters it.

Adding to the Queue

Back in our first function, you only require a few lines to add an item to the queue. The first line gets an environment variable as before, from the local.settings.json file or the application settings within the Azure Function.

The object I create here is to simply pass all the details I need to the queue.

// queueClient is a static IQueueClient field, so that SendMessagesAsync (below) can reuse it
queueClient = new QueueClient(
  Environment.GetEnvironmentVariable("ServiceBusConString"),
  Environment.GetEnvironmentVariable("queueName"),
  ReceiveMode.PeekLock,
  null);

// anonymous object carrying everything the queue-triggered function will need
var fsObject = new
{
  email = emailAddress,
  contactId = (Guid)jContactResponse["value"][0]["contactid"],
  intent,
  date,
  device
};

// fire and forget: serialise the object and drop it on the queue
Task messageReturn = SendMessagesAsync(JsonConvert.SerializeObject(fsObject));

I then call the function SendMessagesAsync, converting the object to a string as I do it. SendMessagesAsync is below.

private static async Task SendMessagesAsync(string messageToSend)
{
  try
  {
    Message message = new Message(Encoding.UTF8.GetBytes(messageToSend));
    Console.WriteLine("Sending message: " + messageToSend);
    await queueClient.SendAsync(message);
    message = null;
  }
  catch (Exception ex)
  {
    Console.WriteLine(string.Format("{0} :: Exception: {1}", DateTime.Now, ex.Message));
  }
}

The call to the method is asynchronous and I don’t wait for the return. I just assume it works and get on with responding to the user.

Reading the Queue

To read the object from the queue, you need to register the function as a subscriber to the queue.

To do this, the function trigger needs to have a certain format

[FunctionName("AddToFS")]
public static async Task AddToFS([ServiceBusTrigger("ccalexa", Connection = "ServiceBusConString")] string myQueueItem, ILogger log, ExecutionContext context)
{

The parameters to the function connect to a queue called ccalexa & a Service Bus indicated in the application setting “ServiceBusConString”. This signature shows that Microsoft is thinking about moving between environments from the start.

The next part of the function defines the parameters for the call to D365. It starts by parsing the object that was found on the queue.

JObject woObject = JObject.Parse(myQueueItem);

Guid contactId = (Guid)woObject["contactId"];
var email = woObject["email"];
var intent = woObject["intent"];
string date = (string)woObject["date"];
string device =  woObject["device"].ToString();

Once we have the detail of the item sent in, we can go to D365 and retrieve some records we need to create the Work Order, firstly to the Contact, to retrieve the account associated with it.

var contactResult = await d365Connect.GetAsync("api/data/v9.1/contacts(" + contactId + ")?$select=_parentcustomerid_value");
if (!contactResult.IsSuccessStatusCode)
{
  return;
}

If the return is not a success, something went wrong. Forgive me for not doing proper error trapping here. Next, we get the default price list from the account & the work order type from the intent passed in.

JObject contactObject = JObject.Parse(contactResult.Content.ReadAsStringAsync().Result);
var accountId = contactObject["_parentcustomerid_value"];
HttpResponseMessage accountResponse = await d365Connect.GetAsync("api/data/v9.1/accounts(" + accountId.ToString() + ")?$select=_defaultpricelevelid_value");

JObject jaccountResponse = JObject.Parse(accountResponse.Content.ReadAsStringAsync().Result);

Guid priceListId = (Guid)jaccountResponse["_defaultpricelevelid_value"];

HttpResponseMessage woTypeResponse = await d365Connect.GetAsync("api/data/v9.1/msdyn_workordertypes?$select=msdyn_workordertypeid&$filter=cc_alexaintent eq '" + intent + "'");

JObject jwotResponse = JObject.Parse(woTypeResponse.Content.ReadAsStringAsync().Result);
Guid woTypeId = (Guid)jwotResponse["value"][0]["msdyn_workordertypeid"];

Next, we build up the object to add as a new work order. The second line shows binding to a price list record using @odata.bind; this method is used for the work order type & account too. I also generate a random number for the name to keep consistent with the Flow version.

JObject workOrder = new JObject();
workOrder.Add("msdyn_pricelist@odata.bind", ("/pricelevels(" + priceListId + ")"));

workOrder.Add("msdyn_name", ("AZ" + new Random().Next(4000, 500000)));
workOrder.Add("msdyn_serviceaccount@odata.bind", ("/accounts(" + accountId + ")"));
workOrder.Add("msdyn_systemstatus", 690970000);

workOrder.Add("msdyn_workordertype@odata.bind", ("/msdyn_workordertypes(" + woTypeId + ")"));
workOrder.Add("msdyn_taxable", false);

if (date != string.Empty) workOrder.Add("msdyn_timefrompromised", date);
if (device != string.Empty) workOrder.Add("msdyn_instructions", device);

log.LogInformation(workOrder.ToString());

HttpRequestMessage createWO = new HttpRequestMessage(HttpMethod.Post, d365Connect.BaseAddress.ToString() + "api/data/v9.1/msdyn_workorders");

createWO.Content = new StringContent(workOrder.ToString(), Encoding.UTF8, "application/json");

HttpResponseMessage createWOResp = d365Connect.SendAsync(createWO, HttpCompletionOption.ResponseContentRead).Result;

Finally a post method to the msdyn_workorders entity pushes this as a new work order into the system.

Connecting Alexa to our Function

This is the simplest bit. In the original set of posts, I talked about endpoints. The endpoint needs swapping to the function call.

The URL is retrieved the same way as I did in my demo, from the Azure Function properties in the Azure portal.

Update the Service Endpoint & you are good to go.

The results

When I started this challenge, I wanted to compare and contrast the response time between Flow and Functions. I would assume Functions would be quicker, but what would the difference be?

Caveats here – Both the flow and the function are my code. I am sure that there are better ways of doing both. Both follow each other functionally so it is a fair comparison.

To test, I swapped the endpoint and then did 5 runs to “warm up the code”. I found that the Azure Function in particular took a while to come up to speed. This can be explained by the cold start of functions, which will be the case in our scenario. Even flows run faster the second time through.

I then ran the test cycle 10 times and used the monitoring within Alexa to measure response time. For both configurations I checked that work orders were being created correctly.

The first blip in the chart is the Flow configuration. This has a P90 (90 percent of the requests were responded to within this time) of over 4 seconds. It drops to a more respectable 1 second as the Flow is warmed up.

The second blip is when the configuration swaps to the Azure Function. You can see this has a peak of around 2 seconds for the first call, then drops to 300ms for each subsequent call. This is a vast improvement in the responsiveness of your app.

Don’t get me wrong, I am not telling you to revert everything to a Function; it is about using the right tool for the job.

Power Automate got me this far; it is great for scenarios where you don’t need an immediate response, and great for proving that bringing Alexa into your toolset is beneficial. But if you want to get serious about the user experience, in this instance a Function serves you better.

Alexa, Field Service and Me (Part 4) – Using Azure Functions

I was lucky enough to be given a speaking slot at Summit Europe in Barcelona on this Alexa subject. Unfortunately this event is now postponed until June, but I had set myself the challenge of replacing the Flow in this solution with something that gives a more capable (quicker) response time.

Power Automate is great, don’t get me wrong. I love that a low code alternative to connecting applications is available. I love that you can automate anything in your business with a point and click interface. But, there are times when the (ex) prodev in me thinks that this approach could lead to applications that don’t respond to you as quickly as they should.

In my Alexa solution (here) I use Alexa to trigger a Flow that checks the user’s email, responds to the user, then calls a second flow to create a Work Order in Field Service. This response takes roughly 3 seconds, which is on the edge of acceptable. My goal is to bring this down to at most a second, using a Function.

Functions

Azure Functions are event-driven serverless bits of code that can complete complex actions or orchestrations. They are built on Azure, normally using a consumption plan (you pay for the compute resources only when they are triggered), so they sit around costing you nothing until called. A bit like Flow in this regard.

They can be written in .NET, Java, JavaScript or Python and can scale with your usage. I have experience from a previous life in .NET, so plumped for this.

Please be warned that I am not a prodev anymore. It took me a while to get a Function connected to D365. My days of coding every day & really understanding the intricacies here are long gone (sadly). If I have done something wrong, then I am sorry. It was just my way of proving a point. I am sure that the performance can be improved further quite easily.

Create a Function

First you need an Azure subscription. You can get a trial with credit to complete a PoC for free. I am not going through those steps.

Secondly, choose your code environment. I used Visual Studio 2019, as I struggled with configuring Visual Studio Code on my PC (it was me), and when I moved to Visual Studio 2019 everything worked. I would recommend starting with Visual Studio Code, as it is free and it is definitely the way forward.

So in Visual Studio, we create an Azure Function. Here I start a new project, select the Azure Functions template, give the project a name, accept the standard options and I get a code snippet for a function ready to run.
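
For reference, the generated snippet looks roughly like this (this is the standard HTTP trigger template, so yours may differ slightly depending on the Functions version):

[FunctionName("Function1")]
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
    ILogger log)
{
    log.LogInformation("C# HTTP trigger function processed a request.");

    // read the name either from the query string or from the JSON body
    string name = req.Query["name"];
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    dynamic data = JsonConvert.DeserializeObject(requestBody);
    name = name ?? data?.name;

    return name != null
        ? (ActionResult)new OkObjectResult($"Hello, {name}")
        : new BadRequestObjectResult("Please pass a name on the query string or in the request body");
}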

Just to check everything is working, let’s publish to Azure & try it out!

Here, I select Publish on the project and create all new versions (the original bits are for my complete one) for Resource Group, Hosting plan & Azure storage.

It takes a while the first time as it is provisioning bits in Azure as well as compiling code, but when it is done, you can try it out in Postman.

To get the URL, hop over to portal.azure.com and search for your function. I use the top bar & search for Alexa.

On the left hand side menu, drop down the Functions Menu to show the function we created.

Top right is a link to the URL that we need to post to. This includes the website address plus a function key. This secures the API a little. This is a PoC, so this is enough for me, but in production, be wary about opening functions to the public

If you hit copy & paste this into a Postman session, or just a webpage, you will get the default response back.

If you add a query parameter like it is expecting, namely name, you will get a different response
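
The call is just the function URL with the extra parameter appended, something along these lines (host and key are placeholders):

https://<your-function-app>.azurewebsites.net/api/Function1?code=<function-key>&name=Dave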

So, this is a simple function, deployed to Azure, with a post & return. That easy!

Alexa Function – Getting the current user

As I said in the outset, my code is not perfect by any means. I expect (encourage) criticism, to expand my understanding of the subject. I don’t intend to go through my code line by line, just the key aspects.

The code is available here

The function has an attribute which defines how it can be called; this is what ties the incoming HTTP request to your Azure Function.

[FunctionName("Alexa")]
public static async Task<IActionResult> RunAlexa(
    [HttpTrigger(AuthorizationLevel.Function, new string[] { "get", "post" }, Route = null)] HttpRequest req,
    ILogger log,
    ExecutionContext context)
{

The next part retrieves the JSON body of the trigger from Alexa and converts it into an object from which we can pick out parts of the request.

string content = new StreamReader(req.Body).ReadToEnd();
dynamic alexaContent = JsonConvert.DeserializeObject(content);

The main part we want is the Alexa access token. This allows, as I described in the second part of the Alexa blog, the retrieval of information about the user that is interacting with Alexa.

if (alexaContent.context.System.apiAccessToken == null)
{
  log.LogError("No Access Token sent");
  return null;
}

This section calls the Alexa API, using the apiAccessToken that was sent, and asks for the email of the user. If we get an error, meaning we do not have approval, we respond straight away, the same as we did for Alexa in Flow.

using (HttpClient client = new HttpClient())
{
  client.BaseAddress = new Uri("https://api.eu.amazonalexa.com/v2/accounts/~current/settings/Profile.email");
  client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", apiAccessToken);
  try
  {
    emailAddress = await client.GetStringAsync("");
    log.LogInformation(emailAddress);
  }
  catch (Exception ex)
  {
    log.LogInformation(ex.ToString());
    var errorObject = new
    {
      version = "1.0",
      response = new
      {
        card = new
        {
          type = "AskForPermissionsConsent",
          permissions = new string[1]
          {
            "alexa::profile:email:read"
          }
        }
      }
    };
    return new JsonResult(errorObject);
  }
}

Alexa Function – Authorisation against D365

The next section retrieves a list of variables stored in a local settings file, local.settings.json, or in the application settings when this is deployed as a proper function. This means you don’t store client secrets etc. within the code, and it also allows you to alter where the function points for dev / test environments.

string baseUrl = Environment.GetEnvironmentVariable("baseUrl");
string clientId = Environment.GetEnvironmentVariable("clientId");
string secret = Environment.GetEnvironmentVariable("secret");
string authority = Environment.GetEnvironmentVariable("Authority");

string getToken = await GetToken(baseUrl, clientId, secret, authority, log);
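
For reference, a minimal local.settings.json holding these values might look like the following sketch (the values are placeholders; in Azure the same keys go into the Function App's application settings):

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "baseUrl": "https://yourorg.crm.dynamics.com/",
    "clientId": "<app registration client id>",
    "secret": "<client secret>",
    "Authority": "https://login.microsoftonline.com/<tenant id>"
  }
}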

The GetToken function is below. I used the code from docs.microsoft.com here. That article steps you through creating an application user and giving it the appropriate rights, as well as how to configure the clientId & secret.

private static async Task<string> GetToken(
              string baseUrl,
              string clientId,
              string secret,
              string Authority,
              ILogger log)
{
  AuthenticationContext authContext = new AuthenticationContext(Authority);
  ClientCredential credential = new ClientCredential(clientId, secret);
  AuthenticationResult result = await authContext.AcquireTokenAsync(baseUrl, credential);
  return result.AccessToken;
}

Next is to check the email is in D365. Firstly, I configure the HttpClient object, then call the API to retrieve all the contacts that have an emailaddress1 equal to the email sent by the user. I only care about the first name & last name, so using $select will return only those fields.

If there is a response, I carry on, if not, I need to respond to the user again.

using (HttpClient d365Connect = new HttpClient())
{
  d365Connect.BaseAddress = new Uri(baseUrl);
  d365Connect.DefaultRequestHeaders.Add("OData-MaxVersion", "4.0");
  d365Connect.DefaultRequestHeaders.Add("OData-Version", "4.0");
  d365Connect.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
  d365Connect.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", getToken);
  HttpResponseMessage contactResponse = await d365Connect.GetAsync("api/data/v9.1/contacts?$select=firstname,lastname&$filter=emailaddress1 eq '" + emailAddress + "'");
  if (!contactResponse.IsSuccessStatusCode)
    return null;
  log.LogInformation("Read Contact");
  JObject jContactResponse = JObject.Parse(contactResponse.Content.ReadAsStringAsync().Result);
  if (!jContactResponse["value"].HasValues)
  {
    log.LogInformation("Cant find contact");
    var errorObject = new
    {
      version = "1.0",
      response = new
      {
        outputSpeech = new
        {
          text = "Hi, Big Energy Co here. Unfortunately, " + emailAddress + " is not registered with us for the Alexa App. Please contact 01234 567890 during office hours to configure your account",
          type = "PlainText"
        },
        card = new
        {
          type = "Standard",
          title = "Big Energy Co",
          text = "Unfortunately, " + emailAddress + " is not registered with us for the Alexa App. Please contact 01234 567890 during office hours to configure your account"
        }
      }
    };
    return new JsonResult(errorObject);
  }

In the Flow version, it is at this point, when I have enough information, that I send over to a child flow. In this scenario I am sending over to a Service Bus queue. I will detail how that works in the next post, as configuring the bus etc. will take time. You can always get ahead of the action by taking a look at the code.

Alexa Function – Respond to the user

Finally, we need a proper response to the user. Similar to the flow, we have a switch on the intent. I use a string that I append to and build up. The “Azure” wording in the responses is only there for me to check that the response came from the Azure version rather than Power Automate.

switch (intent)
{
  case "service":
    returnBody = "Hi, we have recieved your request for an Azure Service ";
    returnBody += device == "" ? "" : " for your " + device;
    returnBody += date == string.Empty ? "" : ". We will endeavour to book a service on " + date;
    break;
  case "repair":
    returnBody = "Sorry to hear you have a Azure broken ";
    returnBody += device == string.Empty ? "device" : device;
    returnBody += date == string.Empty ? "" : ". We will endeavor to send an engineer on " + date;
    break;
  case "emergency":
    returnBody = "Oh No! An Azure Emergency!";
    break;
  default:
    returnBody = "OK, Big Energy Azure Function has got your request";
    break;
}

By creating an object then converting to JSON, this can be used as the response.

returnBody += ". One of our support specialist will contact you within the hour to confirm the scheduled time for our engineer";
var returnObject = new
{
  version = "1.0",
  response = new
  {
    outputSpeech = new
    {
      text = returnBody,
      type = "PlainText"
    },
    card = new
    {
      type = "Standard",
      title = "Big Energy Co",
      text = returnBody
    }
  }
};
log.LogInformation("Responding");
return new JsonResult(returnObject);

Scottish Summit – A Review

The extra day in the year only comes around every 4 years, and it seemed appropriate that a lot of the Power Platform community came together to celebrate it in Glasgow.

Scottish Summit 2020 was sponsored by Avanade, and I went along with several colleagues to enjoy the event, cement Avanade in the community, share our knowledge and learn from others.

With over 1000 people descending on the University of Strathclyde’s Technology and Innovation Centre, this event is one of the biggest free, community-organised events in Europe. Our hosts, Iain Connolly and Mark Christie, have worked hard to promote the event and quickly established it as the place to be for D365 / Azure / Office users and partners.

First Day – Common Data Service Hack

My first day at the conference was participating in the CDS Hackathon. Chris Huntingford and Lucy Bourne from Microsoft introduced us to the requirements for the hack and 20 of us spent the day being creative in groups to resolve an environmental challenge. Our group decided to tackle our carbon footprint, highlighting how our day-to-day choices can have an impact on the environment.

Combining a Forms Pro questionnaire, a Power App for the collation of data by council employees, a model-driven app to maintain the questions and Power BI to visualise it, we produced a solution that would be ready for use within 3 hours! This shows the power of the low-code capabilities of the Power Platform and teamwork!

Thanks to @LeeMBaker for the photo

On Friday evening I managed to catch up with a lot of the Microsoft team attending the event, discussing the partner landscape, challenges they are seeing and how we can work with them to overcome these challenges.

Scottish Summit Saturday

On Saturday, after an early start, we set out our stall. Avanade was showcasing our capabilities to the attendees with D365, across Customer Engagement and Finance and Operations. Mixed Reality has become a big topic within the community, and we were lucky to be able to showcase a HoloLens 2. This was a big crowd-pleaser, with a lot of attendees experiencing this technology for the first time.

Explaining how Mixed Reality can be used for training, 3rd line support or visualisations of products in your environments was an eyeopener to a lot of people and it was great to showcase Avanade’s capability and some customer successes.

Keynote – This is more than just technology…..

What a start! Jon Levesque, Senior Platform Evangelist for the Power Platform at Microsoft was piped into the main hall, with a kilt on to celebrate our location.

Photo courtesy of @MarijnSomers

He started with his personal journey, starting from being a member of the team that supported Steve Ballmer when he was giving talks across the world to his role now, where he is paid to fly around the world and evangelise on the Power Platform.

Jon gave pertinent stories about how the Power Platform has changed people and businesses across the world, empowering people in their day jobs to think about creating solutions to benefit their role.

His enthusiasm for the community and how it impacts the lives of everyone was infectious. Everyone left with a sense of belonging and enthusiasm to develop themselves and their businesses to the next level

Throughout the day, there were various sessions that we attended to educate ourselves.

Session 1 – To Code or not to Code – That is the Question

My first session was given by Scott Durow and Sara Lagerquist, both Microsoft Most Valuable Professionals (MVPs) and at the top of their game. Their talk was about the perceived battle lines that have been drawn between low-code and pro-dev practitioners.

Sara argued that low-code, via Power Apps, Power Automate and the numerous other members of the Power Platform, allows for more reactive delivery and shorter time frames for users to achieve the solutions they need in their businesses. She highlighted that coding can lead to technical debt, development bottlenecks and reduced user adoption.

Scott countered these arguments by highlighting that bad solutions can come from low-code or pro dev equally, and each has its place in bringing solutions to life. Gone are the days when grumpy developers can hold BAs and functional consultants hostage with their use of technical terms.

But the outcome of their presentation was showing that we need to remove the barriers. Technical and functional need to co-exist and work with each other to use the right technology for the problem. Low code has great use cases to empower individuals to expand their applications across data silos, but if you want a speedy integration would Power Automate fit the bill? Encouraging Functional and Technical consultants to have conversations early on in a project and recognising the diversity of their skill sets would lead to increased user adoption and improved solutions.

Session 2 – Mixed Reality – Extending D365 on the next frontier

Kyle Hill, one of Avanade’s MVPs, hosted a session on Mixed Reality. He used the session to show the capabilities of Microsoft’s HoloLens 2 and how it empowers everyone within a business, from architectural uses – walking through a floor plan – to first-line service staff – providing visual cues while they are resolving a problem on a customer site.

He also walked through how a low-code approach can be taken to augment the view of a user and the tools Microsoft provides to create this content.

It was a great show of how this emerging technology will soon be as widespread as mobile phones, available to all at a price point to match. I love the fact that Avanade is at the front of this adoption and are bringing this technology to businesses of all scales to change their approach to solving problems and transforming their processes.

Session 3 – Top 20 Tips for Surviving Networking at Events

Not every session is a technical one. Lucy Bourne, another Microsoft employee, delivered a great session on how individuals can address their anxieties and get more involved in our community. In any community, when you begin, it is a room of strangers, and lots of people are anxious about reaching out and getting involved.

Photo courtesy of @D365Geek

Lucy gave a whistle-stop guide to her 20 tips to survive these moments, gain confidence in your interactions and get involved. She highlighted that everyone has the same fears, everyone thinks they are an imposter at times and everyone is unique. If you are yourself, “celebrate your onlyness”, as she puts it, and are genuine, then you can address your fears and become part of the big family we have as a community.

Session 4 – PowerApps – A Thunderbolt of Functional Awesome

To round out my day, I went to see Chris Huntingford, a Microsoft Partner Technical Architect, show his audience how quick and easy it is to create a production application using low-code techniques and the tools made available to us via the Power Platform.

He explained the benefits and fundamentals of the Power Platform and how we should be engaging with it as users, empowering businesses to democratise their data. He further made it clear that there is only a limited pool of developers, but if we encourage our business users to make edge apps with the data that is available to them, companies can save time and money.

Towards the end of the session, he built an application within minutes, truly showing the flexibility of the platform.

In conclusion

Scottish Summit was a great event, it highlighted the great community we have. With speakers coming from Australia, America and all over Europe, it is an event that has started to become a must for anyone involved in the Microsoft stack.

3 Minute Feature : Episode 9 : Dashboards

A common staple of the model-driven app experience, my 9th video runs you through creating single- and multi-stream dashboards.

Episode 9 : Dashboards

Transcript

Dashboards have 2 variants: those that are connected to an entity and those that are not. An entity dashboard, previously called single-stream, is designed for support desk users or similarly focused individuals. The multi-stream dashboards are for more typical dashboard requirements, displaying MI for your users.

For an entity dashboard,  you need to start from an entity.

Under the entity, go to Dashboards, select Add dashboard and choose one of the overview types; I have gone for the 3 column overview, varied width.

You are presented with the dashboard edit screen. Give it a name and a view to use with the dashboard, remember this is for a single entity.

In the 3 panels at the top you can add 3 charts of your choice, from those available against that entity, and only system charts.

The stream is meant to be a focus area for a list of records to be displayed. Select a view, again for the same entity.

Select Save and close. In the user interface, if you go to the entity in question and select Dashboards, the new dashboard should be visible. It starts with the visual filter not displayed; this is the 3 charts we put in.

If you select any of the charts and drill down, the data in the other charts and the feed is filtered as well

In the top right of the dashboard you can also alter the timeframe, resulting in the records being redisplayed.

For the multi-stream dashboards, add a new dashboard direct from the solution. Here I chose a 3 column dashboard. The same editor appears. You can add components such as charts, views, web resources and even a timeline control. To make room for it, I will delete the 2 spots to the right, then add a timeline control. I can also then expand it to fill the whole of the right of the frame.

Again save and close, to return to the solution view and ensure you publish all customisations.

In the user interface, our new dashboard is now available. The timeline fills out the far side and our other components are visible

Like with views, you can also create your own dashboards. Similarly, if you want to restrict access to dashboards, you need to do it with a personal dashboard and share it to restrict the usage to a particular user group. All system dashboards are available to all users who have the application where the dashboard is configured.

Here I create a copy of my dashboard and then I am able to share it. I go into the dashboard and remove the pane for an account listing. Hitting save shows you my new, personal version of the dashboard.

When using Personal dashboards, ensure the users you share with also have access to any personal views or charts you add to your dashboard

3 Minute Feature : Episode 8 : Charts

Charts in a model-driven app are a great tool for customisers to visualise data.

Episode 8 : Charts

Transcript

Charts in D365 allow a visualisation of the data from a view. We discussed views earlier in the series so won’t go into that.

Starting at the Entity in your solution, you navigate to charts, select Add Chart

The new chart display comes up

We will begin with a simple column chart.

Select a legend entry; this is the column that will be counted or summed etc., giving you your Y axis.

Next, select the series entries. This gives you your x axis bands. If you choose a date field, you can define the date period.

Now hit save and close.

The chart appears in the solution

Over in the user interface, if you select the entity and a view, you will have an option to display a chart. This allows the current view to be displayed in the chart you select, giving you this visualisation.

Here, you can see the chart we defined has been brought through.

If you select one of the bars in the chart, the view gets filtered by the selected data, and it also allows you to drill down into the data, dividing it into other categories.

This chart is a system chart. System charts can be part of your solution, being able to move between environments, but you will not be able to “secure” them. Either everyone sees the chart or no one does.

If you want to only share certain charts with certain groups of users, you should be using user charts. It also makes it easier for me to show you the other visualisations.

On the chart, if I select New, I will be creating a new user chart, initially only shared with my user. The same interface appears, allowing you to define the chart.

I have given this a separate name, hit save and close

Back in the user interface, on hard refresh, the new chart appears. Also, on the chart, I have the ability to share. Here I can share with other users or teams of users.

Using account, as there is more data, I will run through the options available to you. Firstly, I make a copy of the out of the box Account By Industry then select edit.

This chart starts off as a bar chart, firstly switch to a column chart. Next we can add a field to the series to show a max number of employees in each sector.

You can only have more than one series or more than one legend entry, not both at the same time. So next I remove the number of employees and add a second series for country/region.

You can see now that the bars are split up by country then industry. Swapping the series over allows you to visualise the data in a different way.

Let’s look at the other options. Stacked bar or column is great for showing percentage data, but only works if the totals are roughly the same.

Areas have a different visualisation, good for time line data to show increases.

Line does not have the overlap that area has; again, great to show variation over time.

You will notice that I can not choose the other types. This is because I have more than one series or legend displayed. These visualisations rely on 2 dimensional data only.

I remove the series for country to show these other options and select a simple pie chart. This chart has a high number of records that are redundant (no industry), so it may be worth filtering using a different view, or fixing the data.

Funnel charts are really useful to show progress through a sales cycle, changing the size of the segment depending on the count of records.

Tags display a list of the fields in the category & a count. Great for quickly visualising other data for filtering.

And finally donuts are another visualisation

This works well with small sets of types.

To limit data, either use a different view or you can grab the top or bottom number of records from the view. Here I limit to the top 3 types displayed, hence our visualisation is a lot more appealing

Incident App (Part 1)

Most of you won’t know this, but to keep me busy (as if work, blogging, presenting & community isn’t enough) I am the chair of my local platform diving club, Star Diving. Give them a look at http://stardiving.org/ if you or your son or daughter want to learn a great sport that pushes you to control your body & keep fit.

My youngest son is part of the Skills squad (very proud parent) and he loves it. To support him and mainly because I was the last to step back, I became the chair of the club last April.

Star is a registered charity, whose purpose is to promote diving across Surrey and beyond, and which has 200 members of all ages. As a registered charity, Microsoft generously gives Star 25 Office 365 licenses; we host our email there and use Teams and other services.

The requirement

Diving is a dangerous sport to the untrained, and accidents happen. Logging accidents and informing parents, in this day and age of GDPR, has become a concern. Gone are the days when we can keep an accident book at the poolside and have a chat with a parent. Our duty of care needs to ensure any incidents are logged centrally so that repeat incidents can be spotted.

After discussing with our coaches and welfare officer, a list of requirements was created. It wasn’t a formal list, just the result of conversations etc., but like any good BA, I created a list:

  • Centralised, secure list of incidents
  • Ability to add divers to list for trials or competitions
  • Data entry must be easy and not time-consuming
  • Weekly notification of all incidents to the welfare officer
  • Email to parent or guardian of the diver when an incident is raised
  • Escalation to welfare officer and others in the organisation for serious incidents
  • Not cost anything

Of all these requirements, the last one drove me in a direction which is not my normal. I am so used to firing up a CDS environment, creating an entity structure and starting the process from data.

I decided to delve into the unknown (I know the principles, like any Solution Architect, but doing is different) and create a SharePoint list or two to store the data and build a Power App on top to handle data entry. Power Automate will be used to link the app together and provide the notifications.

Introducing Pike

Pike’s front screen has a series of buttons that I will hopefully expand on as I add functionality. I have used the default black and yellow scheme, as Star’s colours are the same.

Data Entry

If you click on Create New Incident, you are taken to the first data entry screen.

This is my first learning point. I knew what the * next to the fields means, but my users didn’t. Don’t assume that just because you know how something works that others will.

This is a simple form based on the SharePoint list. The fields are formatted for the screen, and I didn’t want a lot of scrolling, so split the data entry over 2 screens. The first screen uses a second list for all our divers and displays a drop-down of them.

Further, if the diver is not on the list, for trials or competitions, you can add them using the + next to the diver

The Reporter field is a Person or Group field in SharePoint, which allows linking to a user on our environment. Location is a choice field, with diving specific terms.

Once all the fields that require data are filled in, a big Next button appears at the bottom

This is done by a simple formula on the Visible property of the button, namely, only display if the form is valid
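
As a sketch, assuming the first form is the 'frmIncidentNew-First' control referenced in the Patch formula later, the Visible property of the button is simply:

'frmIncidentNew-First'.Valid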

Selecting the button passes you to a second New form to finish entering the data.

The Contacted? field is a multi-choice field and it kicks off a flow for serious incidents (if a hospital visit or ambulance is called, it is serious by default)

Again, the visibility of the submit button is controlled by the validity of the form.

The submit button doesn’t conduct a straight submit but combines the data from the two forms. Additionally, it notifies the user whether there is a positive or negative outcome to the submission.

Patch(
    Incidents,
    Defaults(Incidents),
    'frmIncidentNew-First'.Updates,
    'frmIncidentNew-Second'.Updates, {'Welfare History': "Created by " & User().FullName}
);
If(
    IsEmpty(Errors(Incidents)),
    Notify(
        "Your incident has been recorded",
        NotificationType.Success
    );
    Navigate(
        Home,
        ScreenTransition.CoverRight
    ),
    Notify(
        "There was a problem submitting your incident. Please contact incidents@stardiving.org manually",
        NotificationType.Error
    )
)

This returns the user to the home screen, where they can view incidents they logged by clicking My Logged Incidents.

Viewing Incidents

As you can see, I got bored in testing & documented a lot of gruesome accidents, which thankfully don’t happen.

Clicking on the arrow shows the detail.

For a normal coach, a reporter, this is the end of the functionality. But for our welfare officer and administrator, there are a couple of additional features.

Securing Special features

On startup of the application, I run a Flow.

This flow is checking to see if the logged in user is the owner of the team that owns the Incidents list.

Firstly, initialise some variables, then call the SharePoint API to check the current user’s rights in the given SharePoint site.

The variable in the URI is passed in from the Power App and is the user’s email address.
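
One way to make this check, as a rough sketch using the Send an HTTP request to SharePoint action (the group name is a placeholder for whatever your owners group is called), is a URI along these lines:

_api/web/sitegroups/getbyname('Incident Owners')/users?$filter=Email eq '<email passed from the Power App>'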

If this call returns a value, it means that the user is in the owner team, and this result is passed back to the Power App.

The isAdmin boolean is set after the call as shown previously.

This global variable is used on the List of Incidents screen to hide or show an icon at the top

The visibility of this is controlled by the isAdmin value.

This button toggles whether the user sees all the incidents or just their own. This allows an admin to view all incidents.

On the incident detail screen, there is an edit button as well, which allows an admin or welfare officer to add some commentary to the incident, such as when the parent was contacted or details about any investigations.

The SharePoint List

In SharePoint, you can see the recorded incident in the SharePoint list

This is a simple use of the list, and it is secure. In the next article, I will walk through the other flows that are triggered to notify the parents and welfare officer to complete the requirements.

Conclusions

It is a simple app, but it really hits the mark for being cheap (free), secure, easy & portable. SharePoint may not be the database of choice for everyone, but you can not knock it for being cheap. Power Apps just adds that extra polish to the data storage mechanism that takes it from a plain list to an intuitive application.

If there are any clubs out there that want to work with me on implementing this app at your organisation, particularly charities, give me a shout. I would be happy to help & share the application and spend time installing with you.

3 Minute Features : Episode 7 : Forms Pt2

This is the second part of the Forms feature, highlighting some useful tips & where to start when Business rules is not enough.

Episode 7: Forms Pt2

Transcript

With forms, if business rules don’t meet your requirements, you may have to resort to Javascript. These little code snippets allow you to do pretty much anything on a form that is not possible in business rules, but be warned, Javascript nowadays is frowned upon. It is code for a start, so you need a skilled professional to create and maintain it. Use it as a last resort.

Each form has a library of files you can upload to it, each file containing one or more functions. Each function needs to be associated with an event, such as the load event of the form or a change event for a field.

You can also establish form parameters, which can be used by your scripts. Say you want certain behaviour only when the user comes from another form; you can use a parameter to pass history from that other form.

Non-event dependencies are a list of fields that are required by scripts. Adding a field to the list locks it to the form.

Back in the maker experience, there are a few nice features to make your life as a designer easier.

In the bottom left is a selection which allows you to visualise how your form will look at certain aspect ratios. As this is the new world of unified interface, this form will be rendered on a phone or tablet as well as a desktop. Here, as the form size is changed, the header section is displayed as the first section. Phone is also the size used for the outlook client.

The header can also be altered and fields added. As you add more than 4 fields, you get a prompt letting you know that this new field won’t be directly visible on the form, but will be part of the flyout only. Opening the app, you can see this.

The default option is high density, as we just saw, but you can also choose to untick this to go back to the previous way. This means the fields are editable in the header, but take up more space. Depends on what you want to achieve in your layout. As you can see, as I was in responsive mode, the header gets shifted to the first section of the first tab

In the user experience, now that we are not in high density mode, the fields in the header are editable directly, with the 5th field being available and editable in the drop down

There are 4 types of form you can create. We have mostly been working with Main forms, the default wide forms which we are used to.

Quick view forms are those to show data from a parent entity.

We can also have Card forms, which appear when the user is displaying a list as a series of cards. Generally, if there is not enough space to render a list properly, the UI will shift to display it as a card view.

To show this, on the account form, I add a quick view form to the form, choose the related entity and the form to display.

Also, we can add a new grid to show off the card view. This, as there is little space in the thin column, uses the card view form.

Quick Create is a form which you first have to enable on the entity by editing the properties of the entity.

Once enabled, you can create a quick create form, add some fields to it and save.

When you select quick create on a grid now, it will display a form to the right of the main form, allowing you to quickly add information, and automatically link the new record to the main entity

2020 Release Wave 1: Improved Email

Part of the 2020 Wave 1 were some changes to the Email functionality, namely contextual email communication & easier selection of email templates.

I have already been through enabling the Wave 1 release here, so won’t do that again.

Easier selection of Email templates

Previously, in an email, you could select the Template box, and it just gave you a selection of the templates.

Once you select the template, this is rendered in the email body.

Now, in 2020 Wave 1, we get a different popup

As you select a template, the right pane shows you a preview of the email, including the data that is part of the template being retrieved from the regarding record you created.

This is a simple change, but is a great user experience change, particularly where templates are heavily used. No longer will you have to do all those clicks to ensure you have the right template, simple but effective. Well done Microsoft.

Contextual Email Communication

This is another brilliant, quick win. Allowing you to switch between the email & the record is an excellent enhancement. You need to enable it.

Start by going to the area selector at the bottom of the site map & select App Settings

This will open the overview pane, and in the middle of that you can select Manage against Enhanced Email

Select Yes, then Save, then go back to an Opportunity form or anything with a timeline.

Select the little + against the activity timeline, select Email & voila, contextual email.

This little screen is a sort of popup, detached from the opp behind it, but you can interact with the form behind

See how you can copy and paste information to make a sales or service user’s life easier.

You can also navigate to other screens

Well played Microsoft, well played.

Email writing in D365 has always been behind that of Outlook, but this will be a big improvement for a lot of people.

2020 Release Wave 1 : Kanban board

On 3rd February, early access to the Kanban board for Opportunities was made available, and as an avid user of Kanban-like boards, I thought I would give it a spin. Kanban is a method of visualising and planning activities by their status. In the standard model-driven app you can apply a process to any record, giving a step-by-step guide to your business process.

UPDATE

I have been playing with this further, and it seems that there is a large caveat around the process that the Kanban uses, details at the end

Enabling the Wave 1 Early Access

Go to https://admin.powerplatform.microsoft.com/, Environments, choose your environment

On the right hand side, there is now an option to Enable the 2020 Wave 1 Early Access. Here, you can see that I have enabled mine

Enable Kanban on Opportunity

The Microsoft documentation here walks you through the steps. For a new feature, having to resort to using the classic customisation interface is something that goes against the grain.

Open make.powerapps.com, select Advanced Settings from the cog top right

This opens the classic interface where you need to go to Settings \ Customise \ Customise the system. You should obviously add this to a solution, but this is just for a PoC.

Scroll down the entities until you find Opportunity, and select it. The third tab lists the controls that are available to the Opportunity.

Select Add Control then scroll down to Kanban, select it then hit Add.

You will then hit Save and Publish.

Displaying the Kanban board

Once this is done, open up the Sales app. Select a view of opportunities. In the top right, select the 3 dots then Show As and choose Kanban.

Once this is done, your Sales process is displayed as columns with each opportunity listed

Each column has a count of the number of records and a sum of the estimated revenue for all the records at the status.

You can search the opportunities like any other view and also change it to display the opportunities by status.

The 3 fields on each record can be edited on this screen, allowing for quick editing of the key fields. You can drill down into each record which pops the main form into view.

Each opportunity can be dragged between columns. This triggers the display of the record for you to choose the next stage.

Thoughts

It is a great feature that will be handy for teams to quickly see the state of opportunities.

It is let down by not being able to customise the card display, well I couldn’t work it out. You are limited to the owner, estimated revenue and estimated close date. I can understand limiting you to 3 fields, but a customiser should be able to choose. I would assume a card form should be tied to this.

Also, when you drag and drop, even if you already have the required fields filled in, the record is displayed. Not really that quick. I would think that if no fields are required for a particular stage change, it should just happen. A bit fiddly.

Finally, the adherence to your process is excellent. I added a step to my sales process and it just re-displayed the Kanban board. No effort.

I am sure this feature will be improved over time, and will be great for those companies that don’t customise away from the Microsoft sales process too much.

Default Process only

After working with a client on demoing this in their environment, it became obvious that there is a fundamental caveat with using the Kanban feature.

If you don’t use the standard Opportunity Sales Process, don’t expect any use out of the new feature as it stands currently. Only the default process is used.

I created an idea here https://experience.dynamics.com/ideas/idea/?ideaid=2548d535-b154-ea11-b698-0003ff68c1c4