3 Minute Features : Episode 2 : Fields

This is the 2nd installment of my 3 minute feature videos, going into detail about the different field types, how they appear on the interface and some other useful stuff.

Episode 2: Fields

Transcript

Starting with the new entity I created last time, select new then field.

This pops up a form to the right, which has a display name and field name for you to enter the detail. The Name can not be changed once you hit save.

Next is the type; as you can see, there are a lot of them. To demonstrate each type, I added one of each to a modal form and display on the right how the default for each type behaves.

There are certain properties for fields that are always present.

Required ensures that there must be data in this field when saving.

Searchable means that this field is visible in views and advanced find for all users who have entity access. Usually this will be ticked.

Calculated or Rollup fields allow you to summarise other fields to suit your business needs. This is only the tip of the iceberg and I will do a video on this soon.

Under Advanced options, give your field a description. The other options available depend on the field type you enter.

Here, a simple Text field has a maximum length. As a best practice, be conservative with your field sizes. Text has a max size of 4000 characters.

On the form this is rendered with only 1 line.

The Text Area has the same restrictions as Text. The designer can decide how many lines to display.

Email is still a text field in the backend database, but is rendered with a control allowing the user to click and email from the page.

The same for URL, popping open the website.

Ticker Field links directly to the MSN Money website to give a quick view of the stock

Phone allows interaction with your telephony provider or Teams etc to direct dial the number

Autonumber effectively gives each record a more user friendly or quotable reference to the record. This is great for case management type activities. You can establish what the prefix is and the number range. It gives you a preview of the format. On the form this “number” only appears when you save the record.
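The prefix and number range come together as an autonumber format string such as QUO-{SEQNUM:5}. Purely as an illustration (this Python sketch is conceptual, not how the platform generates the numbers, and the format shown is an example), here is how such a format expands:

```python
import random
import string

def expand_autonumber(fmt: str, seq: int) -> str:
    """Illustrative expansion of an autonumber format; the platform does this server-side on save."""
    if "{SEQNUM:" in fmt:
        width = int(fmt.split("{SEQNUM:")[1].split("}")[0])
        fmt = fmt.replace("{SEQNUM:%d}" % width, str(seq).zfill(width))
    if "{RANDSTRING:" in fmt:
        width = int(fmt.split("{RANDSTRING:")[1].split("}")[0])
        rand = "".join(random.choices(string.ascii_uppercase + string.digits, k=width))
        fmt = fmt.replace("{RANDSTRING:%d}" % width, rand)
    return fmt

print(expand_autonumber("QUO-{SEQNUM:5}", 42))   # QUO-00042
```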

The first number format is Whole Number. This does not allow decimal places to be entered and you can also establish a minimum and maximum value which helps with data entry. A warning to the user when the entry doesn’t match is given.

Duration is a special field which allows the user to select a duration from a drop-down; the result is stored as a number in the database.

Timezone presents the user with a list of international time codes and the database stores just a numeric representation of it.

Language is similar, with only organisation enabled languages listed

Date and time presents the user with 3 controls: a date picker plus hour and minute drop-downs. The Behaviour option determines whether the data is stored as Coordinated Universal Time and translated to the user's current time zone, or is independent of the time zone, recorded and displayed as entered.
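To make the two behaviours concrete, here is a small illustrative Python sketch (the times and time zones are made-up examples, not anything taken from the platform):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# The user types 09:00 UK time into the field.
entered = datetime(2020, 6, 1, 9, 0, tzinfo=ZoneInfo("Europe/London"))

# User Local behaviour: stored as UTC, rendered in each viewer's own time zone.
stored_utc = entered.astimezone(timezone.utc)                          # 08:00 UTC in the database
shown_in_sydney = stored_utc.astimezone(ZoneInfo("Australia/Sydney"))  # 18:00 for a Sydney user

# Time Zone Independent behaviour: what was entered is what everyone sees.
stored_independent = entered.replace(tzinfo=None)                      # 09:00, no zone attached

print(stored_utc, shown_in_sydney, stored_independent)
```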

IME Mode is applicable for all text fields, and it allows Asian character sets to be entered and rendered. Very advanced, and if it affects you, you will probably already know how to configure it.

The next type only renders a date picker, but still has the behaviour for time zones

Currency is a numeric data type, but also creates a couple of extra fields when you hit save. As the user enters the data, it is saved in the currency selected and also calculated and stored in the base currency of your system. This allows you to roll up opportunities etc. in a common currency.
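As a rough illustration of the base-currency calculation (the field names and rate below are example values; in Dynamics the exchange rate is expressed as transaction-currency units per one unit of base currency):

```python
# Illustrative only: how a money field relates to its hidden _base companion.
exchange_rate = 0.80         # e.g. 0.80 GBP to 1.00 USD, with USD as the base currency
estimated_value = 80_000.00  # value entered in the record's transaction currency (GBP)

estimated_value_base = estimated_value / exchange_rate  # 100,000.00 stored in the base currency (USD)
print(f"{estimated_value_base:,.2f}")
```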

Customer is a special field that can either have a lookup to Contact or Account. The user can choose either. This field is useful for opportunities or other activities where sometimes you may link to one then the other.

Decimal number allows you to define min and max again, but also how many decimal places. These number types are limited to plus or minus one hundred thousand million (100,000,000,000), with up to 10 decimal places.

File is next. File is currently only used in Canvas apps and flows, so it is not on my model-driven form. It stores binary data and has a max file size of 128 MB. Useful for capturing images, PDFs etc in a Power App.

Floating point number is very similar to Decimal, having the same restrictions, but it is only stored as an approximation of your real number. This is done for performance reasons; normally there will be no difference to the end user, but on rare occasions when you are dealing with large numbers this may be of concern.
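To see what "stored as an approximation" means in practice, here is a small Python comparison of a binary float against an exact decimal; the same trade-off applies to the Floating Point Number and Decimal Number types:

```python
from decimal import Decimal

# Binary floating point cannot represent 0.1 exactly, so repeated sums drift slightly.
float_total = sum([0.1] * 10)                              # 0.9999999999999999
decimal_total = sum([Decimal("0.1")] * 10, Decimal("0"))   # Decimal('1.0')

print(float_total == 1.0)    # False: the float answer is a very close approximation
print(decimal_total == 1)    # True: the decimal answer is exact
```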

Lookup fields link records together. This creates a parental relationship to the entity you select. It is a simple way of working with relationships; again, a topic for a future video.

Multi-option set is a great tool for data entry. The customiser decides on the options that they want to display and it is presented as a list where the user can choose one or more of them. The options are stored globally across your organisation, allowing reuse.

Multi line text is for large paragraphs of text. You are limited to a million characters in this.

Option set is identical to the Multi version, but restricts the user to only one choice.

And finally two options is a specific option set for yes / no or on / off options. You can establish how these values are displayed, the user toggles between the two.

Now that we have gone through the options, I'll hit save. At this point, the field is not created; it is highlighted in bold to show which fields have not been saved yet. You can go back and tweak the properties before you hit Save Entity.

3 Minute Features : Episode 1 : Entities

My hiatus from blogging is due to my lack of creativity around thinking about problems and how to solve them, and this got me thinking about a ready source of material that would give me an infinite number of things to write about.

As I always want to learn, and envy those that do these things, rather than a blog, I have started a video series which does a show and tell on a subject, in a set time frame.

Welcome to 3 Minute Feature Thursdays

Episode 1 : Entities

Transcript

Hi and welcome to the first of a hopefully weekly topic on the fundamentals of the Common Data Service.

There are a lot of blogs and videos out there that are documenting all the changes that are forever coming into the platform, so I thought we should take some time to ensure everyone knows the fundamentals.

I will be diving into a topic each week in some depth, but trying to do it in 2 minutes. This gives me a timeframe to work to, reduces unnecessary words and hopefully gives a refresher or introduction to the topic in a bite sized portion.

The first topic is Entities, so here we go

Firstly, go to make.powerapps.com. Once there, check that you are in the correct environment. This is essential to ensure you are adding components to the right place.

As with all development, it should start with a solution, so using the menu on the left, select Solutions, and new Solution.

Give it a name and a publisher; both allow us to differentiate our solutions and to patch them as we move the solution between environments for testing etc.

The default version is 1.0.0.0

Now hit publish. You can see that the solution is empty, so the next step is to create an entity using the menu at the top left.

The display and plural names are used by default in lists and records. The Name is the physical table name in SQL, so can not be changed after the entity is created. The name is prefixed with the publisher prefix.

The primary field is what is selected when the user chooses a record to link 2 records together, like Contact name or company name. You can choose to change the default name and the field name.

As we go down the properties, those with a little cross next to them can not be changed after you set them. You can set a property if it is not set, but once it is set, there is no going back.

A key checkbox is “Enable attachments”. This enables the entity for attachments, basically linking the Notes entity to your new one.

Description allows your entity to be found later on and should describe its usage. The more documentation the better.

If you change the Entity type to Activity, it can be treated as an activity, like emails, tasks or phone calls. Great if your entity is a new version of these things.

If you want to restrict access to certain records in your entity, like opportunities typically, then ownership needs to be User or Team. Organisation level is used for data shared across your whole system, such as products.

In the Collaboration section, Allow Feedback links your entity to the Feedback entity, allowing users, external or internal, to feedback and rate a record. Typically used for case or survey scenarios

Out of the box, meetings can have follow-up tasks associated with them. If your entity fits into this scenario, then enable this.

Connections allow a user to link 2 disparate records together, typically used between employees / contacts / accounts to denote relationships.

If you think that you want to be able to link your entity to incoming emails, then check Send email to Entity. The entity needs at least 1 email field. Leads and contacts are examples of this out of the box.

Mail merge and SharePoint checkboxes enable the corresponding functionality too. SharePoint needs more configuration, but here is where you enable each entity.

Access teams complement the security via ownership. If you enable this checkbox, and set up a template, an access team for each record is automatically created when via code you add your first user to the team.

Queues are typically used in a service scenario, but allow assignment of your record to a queue to be managed by a helpdesk or group of users.

Quick create forms are useful in a lot of scenarios, where you want the user to easily create a new record with a subset of all the fields.

Duplicate detection is common on most records. This setting enables it, but you will have to configure the rules to prevent duplicates.

Flow change tracking allows Flow to subscribe to “When a record is updated” triggers on your entity. Essential in most scenarios.

The final option is enabling your entity for Outlook offline. There is a lot more to consider on this, but here is where you start the journey.

Once configured, hit Create and wait. You can go ahead and add fields and other stuff as you wait. Eventually you will be presented with a nice green banner and a list of the default fields.

So that’s it.

Next time, I will dig into Fields, continuing on our journey.

Please subscribe if you find these useful and provide feedback either via Twitter, LinkedIn or my blog.

Cheers, see you next week.

D365 Org DB Settings – Other

This is the second of a series where I try to document all the rarely used settings available to your Dynamics organisation to tweak the standard behaviour.

If you want to tweak your settings, see my previous post, or if you want to use a canvas app, see this post.

ActivityConvertDlgCampaignUnchecked

Default Value: True
Type: Boolean
Description: Controls the default value of the Record Closed Campaign Response option. When you convert an activity to an opportunity, this default option controls whether the source campaign is set or not.
False – Record Closed Campaign Response checked and source campaign will be set.
True – Record Closed Campaign Response not checked and source campaign is not set.
Min Version: 5.0.9690.2720
Max Version:

When you convert an activity to an opportunity, you have the option to populate a related campaign and also record a closed campaign response. This setting implies that you can default Record a closed campaign response to true, forcing the selection of a campaign, but in my environment it isn't working.

Someone prove me wrong

ActivateAdditionalRefreshOfWorkflowConditions

Default Value: False
Type: Boolean
Description: Enables an additional refresh of workflows that contain wait conditions and that may have to be resumed. This is required to enable a fix that was originally released as a Critical On Demand hotfix, and was publicly released starting with Update Rollup 13. When events in a Wait Until condition are met, the condition is not triggered as documented in KB 2918320.
Min Version: 5.0.9690.3445
Max Version: 5.0.9690.3731

Redundant fix for what looks like an issue in a specific version

allowRoleAssignmentOnDisabledUsers

Default Value: False
Type: Boolean
Description: Enables the assignment of a security role for user accounts with a disabled status. This allows for scenarios where stub users can be created and assigned a different security role. This is needed when a stub/disabled user account needs to own records, especially when these records are from custom entities where custom security roles are required.
False – By default, a security role cannot be assigned to users with disabled status. This is shipped by default.
True – Allows security role to be assigned to users with disabled status.
Min Version: 9.1.0.5610
Max Version:

There are lots of times, particularly around migrating systems, where you are not ready for users to have access to a new system but want to import their data. Think about where you are migrating department by department. The long-term goal would be for users to all have access, so temporarily assigning the records to another user or a system user may not be appropriate. Ownership is usually an indicator and first point of contact in most systems and a driver for security, etc.

Toggling this setting to true allows assignment of roles, and therefore records, to disabled or stub users.

Set to false, it displays this error message when you save any changes.
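For anyone doing this programmatically during a migration, here is a hedged sketch of assigning a role to a disabled or stub user through the Web API once the setting is true. The URL, IDs and token are placeholders to swap for your own; the call itself is a standard N:N association request rather than anything specific to this setting.

```python
import requests

org_url = "https://yourorg.crm.dynamics.com"        # placeholder environment
token = "<bearer token from Azure AD>"               # placeholder
user_id = "00000000-0000-0000-0000-000000000001"     # the disabled / stub user
role_id = "00000000-0000-0000-0000-000000000002"     # the security role to assign

# systemuserroles_association is the N:N between systemuser and role in the Web API.
resp = requests.post(
    f"{org_url}/api/data/v9.1/systemusers({user_id})/systemuserroles_association/$ref",
    headers={"Authorization": f"Bearer {token}", "Content-Type": "application/json"},
    json={"@odata.id": f"{org_url}/api/data/v9.1/roles({role_id})"},
)
resp.raise_for_status()   # with the setting still false, this is where the failure surfaces
```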

ChangeDoubleQuoteToSingleQuote

Default Value: False
Type: Boolean
Description: Changes double quotation marks to single quotation marks within KB articles when the article is viewed.
Min Version: 5.0.9690.3541
Max Version:

This setting will do as it says and convert double quotes to single quotes, but it is only relevant for the old Knowledge Base Articles in your system. Superseded by Knowledge Articles, which now have a proper WYSIWYG editor, this setting is effectively redundant.

ClearSystemUserPrincipalsWhenDisable

Default Value: True
Type: Boolean
Description: Clears and/or populates SystemUserPrincipals values for system users when they're disabled/enabled.
Min Version: 6.1.1.123
Max Version:

I can’t find any information on this. Will update when I find out.

ClientUEIPDisabled

Default Value: False
Type: Boolean
Description: Disables the sending of customer experience feedback for the organization. This option can also be disabled from the user settings area for each user.
False – Enables the sending of experience feedback.
True – Disables the sending of experience feedback.
Min Version: 5.0.9688.583
Max Version:

This, apparently, disables the option to send feedback to Microsoft when you get an error. A great little setting, but until I get a repeatable error I can not see it in action.

CreateSPFoldersUsingNameandGuid

Default Value: True
Type: Boolean
Description: True – creates SharePoint folders using the format of {Name}+{GUID}. False – creates SharePoint folders using just the name.
Min Version: 6.0.0.809
Max Version:

When you integrate SharePoint, a folder in the site is created for each defined entity, normally with the name of the record and a GUID of the record. This can be ugly and not necessary if you have account names that are pretty unique. This setting allows you just to use the record name.

This screenshot shows the setting at its default (true) at the top, and the setting as false for JKL Sales.

DefaultHeightForWizardReports

Default Value: 8.25
Type: Double
Description: With a default value 0, CRM will use 8.25 inches (A4); any other double value will override the default of 8.25. Some printers may reject printed reports if the height is any less than the height of the paper loaded in the tray; this setting will override the height used.
Min Version: 5.0.9690.3541
Max Version:

Printing can be problematic with different paper sizes etc. It is not really a problem with modern printers, but this setting allows you to configure a default report height to ensure printing works for your organisation.

DisableIECompatMode

Default Value: False
Type: Boolean
Description: Changes the server-side, automatic, IE Compatibility Mode flag for Internet Explorer browsers. If you want pages to render in the most recent version of Internet Explorer, set this to True. If you have form scripts or other customizations that require earlier versions of Internet Explorer, this should be set to False. This is also controlled via Settings | Administration | System Settings | Customization.
Min Version: 5.0.9690.3233
Max Version:

Compatibility mode was a big thing, particularly in the IE10/11 era. Making sure your website displayed properly across multiple browsers with inconsistent approaches to standards was a big gripe, but it is thankfully largely a thing of the past, with modern browsers adhering to standards a lot better.

DisableInactiveRecordFilterForMailMerge

Default Value: False
Type: Boolean
Description: When you perform a mail merge, inactive records are not included. This option lets you override that functionality.
False – Inactive records will not be included in the mail merge.
True – Inactive records will be included in the mail merge.
Min Version: 5.0.9688.583
Max Version:

Mail Merge is a deprecated function, but if you still use it, when you call a mail merge from a campaign, inactive contacts or leads are not included.

Mail merge is similar to a quick campaign, so I checked whether this flag also overrides the selection for quick campaigns, but it seems to have no effect there.

EnableBulkReparent

Default Value: True
Type: Boolean
Description: Disables and reparents using a one record at a time approach.
Min Version: 6.0.0.809
Max Version:

I am not sure this is relevant any more. I assume this is for on-premise environments with large datasets, where cascading could lead to large data changes for each ownership change, but in my environment I did not notice any difference between the two options.

EnableQuickFindOptimization

Default Value: 1
Type: Int
Description: For more information, see the “Optimizing the Performance of Quick Find Queries” section in Optimizing and Maintaining the Performance of a Dynamics CRM 2011 Server Infrastructure.
Min Version: 5.0.9690.2720
Max Version:

Another one I assume has gone the way of the dodo for D365; this was initially introduced to roll back the changes made to Quick Find in CRM 2011. I have toggled this on a new system and can not find any effect. This may be because of my minor data load etc and the optimisations that Microsoft has made since 2011.

EnableRetrieveMultipleOptimization

Default Value: 0
Type: Int
Description: For more information, see the “Optimizing the Performance of Queries against Large Datasets” section in Optimizing and Maintaining the Performance of a Dynamics CRM 2011 Server Infrastructure.
Min Version: 5.0.9690.1533
Max Version:

This is another one that is probably redundant for anyone not on CRM 2011, and I would think those administering such systems already know about these settings. There is a full table of the configuration for this setting in the linked document, tweaking the way multiple records are returned in a list (away from the default).

As I do not have a CRM2011 environment and also don’t have the data volumes to see the effect, not sure I can or should add to this.

ExportedExcelRetentionWindow

Default Value: 5
Type: Number
Description: The number of days to temporarily store Excel exported Office Document Records. 30 days max was selected arbitrarily as this is only a cache. Must be at least 2.
Min Version: 7.1.0.1065
Max Version:

Learning is great isn’t it? Researching this one led me to a great article from Bhavesh Shastri about the settings that may affect storage size, which has this setting in it.

Basically, when you export a data set into Excel, D365 stores the result in a temporary table so that you can get back to the result quickly. A deletion service tidies up these temporary tables, and this setting defines how old the records must be before it removes them. If you have a very active system, these records could adversely affect the size of your instance.

ExpireChangeTrackingInDays

Default Value: 30
Type: Number
Description: Maximum number of days to keep change tracking deleted record IDs. You want this value larger than the maximum number of days any change-tracking-dependent services sync with your system. Default is 30 days.
Min Version: 7.1.0.0
Max Version:

This is another storage related one like the previous; again, jump over to Bhavesh's blog for more detail. This setting doesn't appear in the Microsoft KB article where the others are, which is even more confusing and makes me wonder what else Microsoft is hiding.

If you have change tracking services running and worry about deleted records, this setting can be configured to increase the time these records are stored. Obviously, this will impact storage if nothing else.

ForceRetrievePublishedMetadataForRetrieveAllEntities

Default Value: False
Type: Boolean
Description: All RetrieveAllEntities requests will ignore the AsIfPublished flag so that it always retrieves the published metadata from cache.
Min Version: 8.1.1.1020
Max Version:

I can’t find any information on this. Will update when I find out.

FullTextSpecialCharactersToRemoveFromSearchToken

Default Value: null
Type: String
Description: This allows an organization to remove certain characters from a fulltext search string. Example: To remove a wildcard character from a FullText search, add “*”. To remove multiple characters, add them all together in a single string value “*.’#”. The characters are separated by ToCharArray.
Min Version: 8.1.1.1020
Max Version:

I can’t find any information on this. Will update when I find out.

GrantFullAccessForMergeToMasterOwner

Default Value: True
Type: Boolean
Description: When two records owned by the same team are merged, the final record is shared with the owner of the record it was merged from. This creates redundant POA records; as a result, if the owner of the record is changed in the future it will be visible to team members of the previously owning team. To do this, set to false.
Min Version: 5.0.9690.4449
Max Version:

Another one of those security options that most people don't know about. I would hazard a guess that most people don't know that the owner of a subordinate record in a merge has the new master shared with them. Mark Carrington has a great explanation of this setting and the next in his blog here.

The default setting puts a row in the share table for the original owner, even if it is the same as the new owner, which is useful in a lot of cases, but could also break your security model if the owner changes again.

With the setting set to false, the original owner does not have the record shared with them.

The wording of this setting implies that only the master owner will have the sharing stopped, but it seems that the original does as well.

GrantSharedAccessForMergeToSubordinateOwner

Default Value: True
Type: Boolean
Description: To turn this setting off, this must be set to false. Records are shared with inherited access to subordinate owners during merge. This will not occur when set to false.
Min Version: 5.0.9690.3911
Max Version:

If you have a contact and merge its account, the contact will be moved to be owned by the master owner, but the owner of the contact will have the master record shared with them too. Obviously, this just makes the system work, but it doesn't help if you have a specific security principle in place.

Please see Mark’s blog to fully understand these values.

HideStageAndUpgrade

Default Value: 1
Type: Int
Description: Changing this setting to 0 will allow users to see the upgrade solution option in the solution import wizard when importing solutions with a higher version than previously imported. This setting has a default of 1, which hides the option to perform a stage for upgrade.
Min Version: 9.1.0.3
Max Version:

Back in 2016, Microsoft gave us the ability to stage an upgrade solution, which means we can more easily remove parts of a managed solution, rather than using holding solutions.

Nishant Rana has an excellent blog here that describes the process. The process relies on having the Stage for upgrade tick box available to you, which is now hidden by default presumably because Microsoft has provided us the ability to create patches for solutions and it is no longer needed.

HierarchyLevelForHierarchyFeature

Default Value: 3
Type: Number
Description: The hierarchy level used for hierarchical security.
Min Version: 7.0.0000.3027
Max Version:

Another security tweak. There are 2 security models that may be in use in your instance, namely managerial or positional security.

Both can take into account a number of managers above you who have the same access to records that you do. Traditionally this is 3, and anything above this can cause performance issues. As you can set the hierarchy level on the hierarchy configuration screen, I think this setting is now redundant, as any change you make to the org setting is not reflected on the hierarchy settings screen.

IfdAuthenticationMethod

Default Value: null
Type: String
Description: Changes the request sent to the ADFS server, this settings default value
Min Version: 5.0.9690.2835
Max Version:

This setting is for on-premise only, as it defines which SSO an internet-facing deployment (an externally facing on-premise deployment) uses to establish that the user has the right access credentials.

This blog has a more detailed explanation.

IncludeHTC

Default Value: True
Type: Boolean
Description: Set whether forms should support the HTML component. The CRM 4.0 API uses this feature, which is deprecated and will no longer be included after the next major release. If you have JavaScript code relying on CRM 4.0 JavaScript APIs you should enable this and work to update your code to support CRM 2011. This is also controlled via Settings | Administration | System Settings | Customization. This setting is NOT SUPPORTED IN CRM 2013.
Min Version: 5.0.9690.3233
Max Version:

This setting is only applicable to CRM 4. I know there are still users out there, but I can not get access to a system to try out this setting, and I assume even the CRM 4 people won't still be using this feature.

inheritedRolePrivilegesFromTeamRoles

Default Value: True
Type: Boolean
Description: Disables the Azure AD Group team functionality of the organization in the event that there is a performance related issue. The Azure AD Group team feature is shipped enabled by default.
False – Disables Azure AD Group Teams; members of group teams are required to have their own security role assigned to them directly. Run-time calls to Azure AD to obtain the user's AAD groups are stopped.
True – Azure AD Group Teams are enabled; members of group teams inherit user/basic privileges directly and user privileges are derived at run-time. Run-time calls to Azure AD to obtain the user's AAD groups are invoked.
Min Version: 9.1.0.4632
Max Version:

Earlier this year, Microsoft gave us the ability to give security permissions to all the members of an Azure AD group, rather than having to duplicate security across Azure and Dynamics. This can be a time-saving godsend for your administrative staff, but it can also impact performance. Setting this value to false forces the old method of security, with each user having to be configured with a role in D365.

Debajit Dutta has an excellent walkthrough on how to utilise this functionality here.

IdsCountBeforeUsingJoinsForSecurity

Default Value: 1000
Type: Int
Description: Not documented in optimization paper.
Min Version: 5.0.9690.2720
Max Version:

I can’t find any information on this. Will update when I find out.

IdsCountForUsingGuidStringsForSecurity

Default Value: 20
Type: Int
Description: Not documented in optimization paper.
Min Version: 5.0.9690.2720
Max Version:

I can’t find any information on this. Will update when I find out.

IntegratedAuthenticationMethod

Default Value: null
Type: String
Description: Changes the request sent to the ADFS server, this settings default value
Min Version: 5.0.9690.2835
Max Version:

An On-premise property, which I can not find anything on.

JumpBarAlphabetOverride / JumpBarNumberIndicatorOverride

Default Value: null
Type: String
Description: How to customize the Alphabet Bar for the CRM Application Grids for Microsoft Dynamics CRM 4.0.
Min Version: 5.0.9690.2243
Max Version:

I didn't know this was a thing, but it is only for CRM 4.0. Basically, you could override what appears at the bottom of a search display.

Shivam Dixit has a walkthrough here

ListComponentInstalled

Default Value: False
Type: Boolean
Description: If CRM and SharePoint use ADFS and users click to create a folder for a record in CRM 2011, intermittently the SharePoint page is shown instead of the list part grid page, causing confusion with users. This setting allows you to force CRM to use the installed list grid component in SharePoint when using ADFS. False: use the standard method of detecting SharePoint authentication. True: if CRM and SharePoint have ADFS enabled, force CRM to use the grid display.
Min Version: 5.0.9690.3911 / 6.1.0.581
Max Version:

This was a bug fix to force display of the SharePoint list instead of the page in a SharePoint created folder when a list is available.

LookupNameMatchesDuringImport

Default Value: False
Type: Boolean
Description: Importing a solution that was created from an upgraded 4.0 deployment fails. Changing this setting makes the import solution look up the names for forms, views, workflows and security roles.
Min Version: 5.0.9690.583
Max Version:

I have no access to CRM 4 to create a solution to test this out, but if anyone can send me screenshots, I'll add them in.

MaximumChildUsersCountLimitBeforeUsingJoinForHSM

Default Value: 80
Type: Number
Description: Maximum child/subordinate users count before using a join query for the Hierarchical Security Model.
Min Version: 7.0.0000.3027
Max Version:

As mentioned, the hierarchical security model gives your manager (or your manager's manager, etc.) the same access to your data that you have. For a manager with lots of reports, this can cause performance problems, so limiting the maximum number of unions the system makes when deciding on your access makes sense. After this number is reached, an outer join will be used instead to improve performance.

If you have scenarios that require that number of child users to a manager, I would recommend not using Hierarchical security.

MinRowCountForFKIndexCreateInReferencingEntity

Default Value: 100
Type: Number
Description: Setting min row count in referencing table for ForeignKey index creation.
Min Version: 7.0.0000.3027
Max Version:

In the D365 world, the back-end SQL optimisation is mostly out of an administrator's hands, with indexes automatically created for primary keys or alternate keys, for example. This setting tweaks the creation of indexes for foreign keys.

If an account has a foreign key to a location entity, there needs to be more than 100 rows (by default) before the system will create an index on the account table using the location foreign key.

OfficeDocumentPersistenceTimeInDays

Default Value: 7
Type: Number
Description: The number of days to temporarily store Office Document Records. 30 days max was selected arbitrarily as this is only a cache.
Min Version: 7.1.0.1059
Max Version:

When an office document is retrieved from the database, it is stored in the cache, by default for 7 days. This can be altered. This is one option that impacts the size of your tenant.

PageSizeForHierarchyFeature

Default Value: 5
Type: Number
Description: The hierarchy page size used for hierarchical security.
Min Version: 7.0.0000.3027
Max Version:

I assume that this is used in the same way as the MaximumChildUsersCountLimitBeforeUsingJoinForHSM setting, but I can not find any documentation for it.

ReassignAllExtendedTimeout

Default Value: 0
Type: Number
Description: Increase script timeout for reassigning all records of a user or team – this allows you to exceed the default extended timeout value. Default extended timeout is 1000000 ms (roughly 15 minutes). WARNING: Care should be taken when increasing this value above the default – always double check the number of minutes before setting this to a value higher than the default.
Min Version: 7.1.2.1020
Max Version:

If you have a security model that relies on cascading of ownership, when a parent owner is changed the change is also applied to children, children of children, and so on. This can be a time consuming process, and this setting lets you increase the time-out for it, possibly temporarily if you have an occasion that requires it, though the default 15 minutes is already a very long time.

RecordCountLimitToSwitchToCteSecuritySql

Default Value: 75000
Type: Int
Description: For more information, see the “Optimizing the Performance of Queries against Large Datasets” section in Optimizing and Maintaining the Performance of a Dynamics CRM 2011 Server Infrastructure.
Min Version: 5.0.9690.2720
Max Version:

This setting is for on-premise only, controlling the switch to using a temporary table or a CTE (Common Table Expression) to generate the results. I have not got enough data to check if this still has an impact on D365 environments.

The linked paper has a lot of good material about optimisations that, thankfully, the more modern SaaS environments have taken care of.

RetrieveMultipleSharingCountThreshold

Default Value: 1000
Type: Int
Description: For more information, see the “Optimizing the Performance of Queries against Large Datasets” section in Optimizing and Maintaining the Performance of a Dynamics CRM 2011 Server Infrastructure.
Min Version: 5.0.9690.2720
Max Version:

This is another setting for optimisation of the on-premise architecture, defining the number of records shared with the user (either directly or via teams) for the entity being searched, above which the system will convert the query to use joins rather than table-valued functions.

As sharing with individuals should be an increment to your security model rather than the main part, having 1000 shares per entity would be pretty rare.

SecurityQueryHint

Default Value: 1
Type: Number
Description: Used to hint the query within GetRightsFromPrincipalObjectAccess. 0 = None; 1 = Recompile (default); 2 = OptimizeForUnknown.
Min Version: 8.1.0.141
Max Version:

I can’t find any information on this. Will update when I find out.

SharingLimitForPOASnapshotTable

Default Value: 10
Type: Int
Description: Not documented in optimization paper.
Min Version: 5.0.9690.2720
Max Version:

I can’t find any information on this. Will update when I find out.

skipAadGroupObjectIdValidation

Default Value: False
Type: Boolean
Description: Disables the validation of the Azure AD Group objectID and allows the application to create a Group team in CDS. This is used to mitigate the latency in the Azure AD distributed cache where a newly created Azure AD group cannot be validated if the subsequent Azure AD Group graph call goes to a distributed cache server that does not have the new Azure AD group yet.
False – do not skip Azure AD group objectID validation during Group Team creation. This is shipped by default.
True – skip Azure AD group objectID validation to allow the application to create the Group team.
Min Version: 9.1.0.5808
Max Version:

If you use the new functionality to connect teams to AD groups, allowing a single location to manage security, setting up the teams needs a link back to the AD group.

As AD group creation takes time to propagate to the various caches, the group you want to link to might not validate straight away. By default, a check is made that the GUID you enter matches an existing Azure AD group; setting this value to true skips that validation, so the group team can be created without waiting for propagation.

SkipGettingRecordCountForPaging

Default Value: False
Type: Boolean
Description: Disables the record count query. This query is responsible for retrieving the total number of records returned for each view. This query can cause longer search times and may cause SQL timeouts or exceptions.
False – Enables the record count on views.
True – Disables the record count on views.
Min Version: 8.2.0.0503
Max Version:

When you have a large dataset, knowing that you have thousands of records rather than just the 100 the view is showing is invaluable, but the count has to be queried as well as the data itself, impacting performance.

With the setting applied, the view doesn't display an accurate count, but it does still display a number, which is misleading in my opinion.

SkipGettingRecordCountForPagingForAudit

Default Value: False
Type: Boolean
Description: Disables the record count query for just the Audit entity. False enables the record count, and True disables the record count.
Min Version: 8.2.0.0503
Max Version:

This is the same as the previous setting, but just for the auditing views, which are notoriously large.

SkipSuffixOnKBArticles

Default Value: False
Type: Boolean
Description: Disables the suffix from being used on the automatically generated KB article numbers.
False – Enables the suffix on KB articles.
True – Disables the suffix on KB articles.
Min Version: 5.0.9690.1992
Max Version:

Pretty self-explanatory, but for old KB articles rather than the new Knowledge Articles.

TabletClientMaxFields

Default Value: 75
Type: Number
Description: Maximum Tablet Fields. Max 500 / min 1.
Min Version: 6.0.0.809
Max Version:

These tablet settings do not apply to D365 v9, as the Unified Interface has removed the limitations.

TabletClientMaxLists

Default Value: 10
Type: Number
Description: Maximum Tablet Lists. Max 50 / min 1.
Min Version: 6.0.0.809
Max Version:

These tablet settings do not apply to D365 v9, as the Unified Interface has removed the limitations.

TabletClientMaxMashups

Default Value: 3
Type: Number
Description: Maximum Tablet Mashups.
Min Version: 7.0.0000.3027
Max Version:

These tablet settings do not apply to D365 v9, as the Unified Interface has removed the limitations.

TabletClientMaxTabs

Default Value5
TypeNumber
Description Maximum Tablet Tabs max-50/min-1
Min Version 6.0.0.809
Max Version

These tablet settings do not apply to D365 v9, as the Unified Interface has removed the limitations.

traceLogPersistenceTimeInDays

Default Value: 30
Type: Int
Description: This sets the amount of time that TraceLog data is maintained before being removed by the Deletion Service.
Min Version: 8.1.1.1020
Max Version:

This is another setting that can impact the size of your instance. The logs are stored by default for 30 days; if you have a heavily used system this will be a lot of data, so reducing this will reduce costs. Trace logs are part of a separate pricing tier now, but it is still worth a look.

UseOrganizationServiceForMultiEntityQuickFind

Default Value: False
Type: Boolean
Description: Allows Multi-entity Quick Find to run serially rather than in parallel. This allows plugins to be executed on RetrieveMultiple.
Min Version: 8.2.1.0135
Max Version:

RetrieveMultiple plugins can add a little more security, preventing the display of records or adding data into a record set, if you need the results to be really, truly accurate.

Whilst this is frowned upon, as it has a real impact on system performance, it might be necessary. If you need this functionality, quick find doesn't honour the plugin logic because its queries run in parallel. Toggling this to true makes each quick find query run serially, allowing RetrieveMultiple plugins to work.

D365 Org DB Settings – Canvas App

On the back of one of my other posts on the D365 Org DB Settings I thought it would be good to re-imagine the method to update these settings in a Canvas app.

The solution from Sean McNellis is great and has been a big influence on my design; the settings XML is pretty much a copy. I am hoping that I give users a more visual experience and that this is a starter for a bunch of D365 CE admin apps in Power Apps.

I am now looking for people to test it and give me feedback on the solution, so if anyone has a little time to be critical, please contact me on the blog, via Twitter or on GitHub as an issue.

https://github.com/CooksterC/D365-Admin-Tools

Overview

Hope you like green….

On the left is a list of the options for configuration. By default, it only shows the ones that are already configured, but we can alter this to show all settings by toggling the top button. You can also search at the top for the setting you want.

The list is also colour coded, to highlight the ones you have changed, the ones that are configured already and the ones that have no setting.

The grid shows the official documentation of the setting and a link to the KB article.

The Bin button will remove this setting from your configuration, returning it to the default.

The + button will add the setting to your configuration using the default value.

The Value entered can be presented with different controls depending on the setting being configured.

At the top right, you can save any configuration changes, which refreshes the list and the bottom right panel, which shows the current configuration

The top panel will display the documentation on the settings. There is also the ability to copy a configuration from another system or manually edit the xml.

NO WARRANTIES GIVEN. If you get this wrong, then don’t blame me!

Installation

The package is a solution, with 2 components, the Canvas App and a custom connector.

As part of the app, the settings XML is cleared before it is replaced with the new configuration; using the standard connector gives an error when this is done, so I reverted to doing this via a custom connector. Also, the settings are reliant on the version of your D365 instance. This is less of an issue now that everyone is on the same version, but for those on premise the version is required, which again is not available in a normal connector.

I have found that if I include the connector in a solution, it has the intended effect of bringing with it the connection to the API that it is configured for. This is great for deploying apps using a common API, but in this scenario I don't want you to use my tenant, but your own. I am not sure how this could be improved, particularly for ISVs etc. Also, as the solution matures, every time you import the solution the connector needs reconnecting, which isn't ideal.

So there are 2 parts: the Canvas App exported as a package, and the connector.

Connector Configuration

Select New Custom Connector / Import as OpenAPI file

Put in your environment link here. This is mine, and the solution method of connectors copies these settings; thankfully it is OAuth, so unless you know my password (and this is a trial) it is of no use to you.

Next is security. Edit the OAuth 2.0 settings.

You need to enter the Client Id and Client Secret taken from the Azure AD authentication, as well as the Resource URL, which is the address of your environment. I stepped through this when I created my first custom connector for LUIS integration.
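If you want to sanity-check the same Client Id, Client Secret and Resource URL outside the connector, a rough Python sketch of the client-credentials flow with MSAL looks something like this (all of the identifiers below are placeholders, and the app registration still needs an application user in your environment):

```python
import msal

tenant_id = "<your Azure AD tenant id>"          # placeholder
client_id = "<app registration client id>"       # placeholder
client_secret = "<app registration secret>"      # placeholder
resource = "https://yourorg.crm.dynamics.com"    # the Resource URL / environment address

app = msal.ConfidentialClientApplication(
    client_id,
    authority=f"https://login.microsoftonline.com/{tenant_id}",
    client_credential=client_secret,
)
result = app.acquire_token_for_client(scopes=[f"{resource}/.default"])
access_token = result["access_token"]   # bearer token for the Web API (check for errors in practice)
```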

Update the connector and move on to testing.

I have only got 3 actions defined here, keeping it to the actions I need for the application plus WhoAmI, which is so simple that it lets me confirm the connection prior to worrying about syntax.

As this is OAuth, you need to configure a connection, which prompts you to enter your credentials.

Test the operation to prove you have got your connector up and running.
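The same WhoAmI test can be reproduced outside the connector as a quick check; a sketch is below (placeholders again for the environment URL and token):

```python
import requests

org_url = "https://yourorg.crm.dynamics.com"             # placeholder environment
access_token = "<bearer token from the snippet above>"

resp = requests.get(
    f"{org_url}/api/data/v9.1/WhoAmI",
    headers={"Authorization": f"Bearer {access_token}", "Accept": "application/json"},
)
resp.raise_for_status()
print(resp.json()["UserId"])   # a GUID back means authentication and the endpoint are both fine
```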

Install the Canvas App

Go to your apps and select Import package. Select the zip file from the GitHub repository.

The first time you install, it has nothing to update, so you will need to change the action to “Create as new” and give it a new name if you don’t like it.

On the connector, you should connect it to the one that has just been created

Hit import and you are done! Run the app to see a default configuration.

Like I have said, I would like feedback, particularly on how to improve the installation process. It is rather fiddly; I would love to be able to just install a solution and be done.

IFTTT – Stopping Freezer melt

I have a problem, well my family has a problem. We can’t shut the freezer door. Regularly (once a month) our freezer would be left open just enough to defrost but not enough to sound the audible alarm, resulting in squishy food.

Annoying.

To combat this, I bought myself a Sonoff TH10 temperature sensor and configured it to send me an alert when it reached a certain temperature. The problem is I became blasé about the alert, as it would go off as soon as someone opened the door, or I was not around to react. The module would only alert me once, when a warm temperature was reached, not keep reminding me or escalate to someone else when I didn't react by shouting at a teenager to close the door. So how could I ensure it alerted me less frequently and also alerted the rest of the family when there was a real issue?

I know that this article is a little bit of fun with no real business benefit, but using Flow to fix a problem in my life is worth talking about. Everyone will have a little annoyance that Flow can help with, and it acts as a training exercise for us all.

Design

As with all problems, starting with a flow chart helps. This is what I need to achieve. The Sonoff device is configured to turn on and off when a certain temperature is hit. IFTTT is triggered by Sonoff on both of these conditions.

When the hot temperature is hit, wait 30 minutes (to allow for the temp change when a door is opened and closed naturally) then check if freezer is still is over temperature. If it is still hot, send out a notification.

Then wait 60 minutes and check again if the temperature is too high, before repeating the loop: sending out a further notification and waiting 60 more minutes.

If we check the temperature and it is now back down to cold, stop the process.
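Purely as a conceptual sketch of that design (the real implementation is the IFTTT and Flow combination described below, not code):

```python
import time

def freezer_alert_loop(is_still_hot, notify_me, notify_family):
    """Conceptual version of the design: wait, re-check, notify, escalate, repeat."""
    time.sleep(30 * 60)              # initial 30 minute grace period for normal door openings
    first_alert = True
    while is_still_hot():
        if first_alert:
            notify_me("Freezer is hot")            # first alert goes just to me
            first_alert = False
        else:
            notify_family("Freezer is STILL hot")  # later alerts escalate to the whole family
        time.sleep(60 * 60)          # wait another hour before checking again
    # temperature is back down to cold: stop quietly
```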

IFTTT configuration

If you don't know If This Then That (IFTTT), it is a great service to interact with disparate systems: a point-to-point integration to do something on a trigger. I have used it previously to log sales users' calls from an Android phone, and I use it to change my phone's background with the NASA wallpaper of the day. It is a free service which plugs the gaps where Flow does not have direct connections, such as with the Sonoff device.

Sonoff can trigger IFTTT, which in turn can trigger a web service, triggering the Flow.

Log into IFTTT and link your account to eWeLink (this is the app you install to manage the Sonoff switches).

Select Create from your account

This presents you with the standard IFTTT applet creator.

Hit the big + to display all the available services, eWeLink should be on that list.

Next, you get a long list of the options eWeLink makes available to you; this is where you need to select the appropriate switch, for me just a single channel switch.

Next step is to select your switch, named by you when you configured it in the eWeLink app, and whether you want this to be triggered when it is switched On or Off. For me, On means that the freezer has gone below -20°C; Off means it has gone above -17°C.

IFTTT then needs to know what to do.

Hitting the big + again lists the action services, the one we want is a WebHook.

Next, the action requires a URL, which comes from our Flow, so skip to that section to get it. We aren't posting anything to this hook, it is just a trigger. You could secure this a little more by passing in a key, so you know it is you that is triggering it.

Flow Configuration

Our flow starts with a web trigger.

I am passing a delay to the trigger, though the initial call will not have any delay.

Next, initialise a variable for checking the freezer later as well as an ID for the SharePoint Item I will create later.

If this has come from the initial IFTTT trigger, then we assume the delay is 30 minutes. You could just update a delay variable, but as you will see, I want to do something different if I am passed a delay, the second time we call it.

In the yes branch, we now wait for the default 30 minutes.

Cancelling the Flow

Now, according to our design, we need to check the temperature to see if it has dropped back below our trigger point.

Ideally, you would call the Sonoff and ask for a reading, and I am sure more expensive IoT devices allow you to do this, but I am cheap. The Sonoff and eWeLink interfaces allow you to look at the temperature on your phone, but do not broadcast the temperature out to anyone apart from the App, just the on / off trigger. Checking the temperature is not an option.

What about creating another trigger to turn off this looping? Unfortunately, this isn't possible; there is an idea on the Flow community which you can vote for here. I would also like to be proven wrong, so reach out if there is a way to cancel a flow run.

I started doing this by adding a file in a Google Drive location (it could be SharePoint or OneDrive, but I wanted to expand my use of connectors) when the temperature went down, but this was inconsistent. I ended up getting multiple emails, as the file create did not happen every time. I am sure I could sort this out, but that wasn't really the point of the flow.

I ended up creating an item in a SharePoint list when the flow was first triggered, completing the delay, and then checking this item to see if another process had updated it while I was in pause mode. If the item has been updated, just stop the Flow. If it hasn't, send the email alert and trigger the flow again to repeat the process.

First, create the item in SharePoint with the current time (this is just for my interest, so I know how many times the freezer temperature triggers it).

The Id of the item that is created is copied so we can pass it to the next iteration of the flow if needed.

After the delay, check the item that was created for a cold temperature, using a bit of OData filtering to only return the item that was just created, and only if there is no cold time.
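For reference, the filter query on the Get items action is just an OData expression; the sketch below shows the sort of string being built, where ColdTime stands in for whatever your list column is actually called:

```python
# The sort of OData filter the Get items action ends up using: only the item this run
# created, and only if no "cold" timestamp has been written back to it yet.
item_id = 42   # stand-in for the SharePoint item ID captured when the item was created
filter_query = f"ID eq {item_id} and ColdTime eq null"
print(filter_query)   # ID eq 42 and ColdTime eq null
```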

This Scope usage is only to allow me to copy the contents to the other branch, saving me time. The copy function is really useful; if you haven't seen it, why not?

Next, if any items are returned (i.e. the item that was created has been updated with a cold temperature), the title is changed to Cancelled. This is not really needed, but gave me something to do in the apply to each. The boolean is set to true, meaning the freezer is now cold and this flow can cancel itself.

Out of the loop, check the value. If the previous loop did anything, it will be true; if not, false. If the freezer is now cold, just terminate gracefully. If the temperature has not been updated, let it fall through to the next action.

The final part of the original check (was a delay passed in?) is the email. In this no-delay path, an email is just sent to me. I am using Gmail here, just because it was a new connector.

Finally, call the same flow but pass in the delay we want and the Id of the SharePoint item that was created.

What about if it gets really serious?

Telling me the freezer has been left open is fine; it usually results in a yell to a teenager to shut the f**king door. But what about when I am not in?

After the first half-hour delay, if the freezer is still showing hot, I need to escalate, which for me means emailing my whole family. In the No branch, I delay for whatever value was sent in, set a variable to pass on to the next iteration and do the same as in the other branch; the difference is that I include more people on the next email.

Here, I email all my family, with a nice in your face subject and also hike up the importance. Hopefully, they won’t ignore this email like they ignore me.

Triggering when it is cold

Using IFTTT the same as earlier, if the temperature hits -20°C, the second trigger is fired. This updates any items it finds in the list without a cold temperature, setting the trigger time.

And that’s it.

In my inbox, after I have taken the temperature controller out of the freezer for a little while, I get a series of Freezer is hot messages, also sent to the rest of the family.

Once it gets back in, the notifications stop.

And in SharePoint, there is a list of the hot & cold triggers, from natural door opening, thankfully

D365 Org DB Settings – Email

On a client recently I helped deploy Microsoft D365 App for Outlook. Unfortunately, the behaviour requested by the client and the default behaviour of the Server Side synchronisation was not aligned.

This led to long discussions with Microsoft about some of the Org DB Settings we could utilise to tweak the way SSS works. As I researched these settings and discussed them with colleagues and at networking events, it became clear they remain unknown to most developers and administrators. If your deployment needed to tweak the settings then you knew about them, but they are not common knowledge.

Further, there seemed to be very little documentation on what each setting does, apart from Microsoft’s own information here.

This is a series of posts explaining each of the Org settings, understanding how it affects your environment. Hopefully it will raise awareness of these settings.

Bear in mind that there are a lot of settings that I can not find any information on or have no experience of. This is frustrating for me, but I assume someone in the community will push me in the right direction and I will update the page when I find out.

How to Change Org DB Settings

Sean McNellis has an excellent tool on GitHub here which allows interaction with the org settings via a solution in your environment.

Once the solution is imported, selecting it gives you an interface where you can check the current status of each setting and change them.

To change the default, you need to select the Add hyperlink. This creates an XML file including your property which is uploaded to your system to change the setting.

Selecting the Edit link for the attribute now gives you a popup window, where you can edit the value

Also, Sean has included a copy of the Microsoft description with each setting.

On the right side of the grid is a link to the KB article that mentions the setting, though normally it is just KB 2691237, which is the central list of all Org settings.
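Under the covers, all of the overrides live in a single XML document on the organization record, and the tool is essentially writing that column for you. As a hedged sketch only (the URL, token and element naming below are placeholders and assumptions on my part; in practice you should read and merge the existing XML, or better, just use the tool), an update through the Web API would look roughly like this:

```python
import requests

org_url = "https://yourorg.crm.dynamics.com"      # placeholder environment
token = "<bearer token>"                           # placeholder
headers = {"Authorization": f"Bearer {token}", "Accept": "application/json"}

# There is a single row in the organization table; grab its id and current settings XML.
org = requests.get(
    f"{org_url}/api/data/v9.1/organizations?$select=organizationid,orgdborgsettings",
    headers=headers,
).json()["value"][0]

# Each override is an element named after the setting inside one settings document.
# NOTE: writing the column replaces the whole document, so merge your change into the
# existing XML (org["orgdborgsettings"]) rather than overwriting it blindly.
new_xml = "<OrgSettings><AutoCreateContactOnPromote>false</AutoCreateContactOnPromote></OrgSettings>"

resp = requests.patch(
    f"{org_url}/api/data/v9.1/organizations({org['organizationid']})",
    headers={**headers, "Content-Type": "application/json"},
    json={"orgdborgsettings": new_xml},
)
resp.raise_for_status()
```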

AddressBookMaterializedViewsEnabled

Default Value: True
Type: Boolean
Description: Changes the way the CRM Client queries SQL CE.
Min Version: 5.0.9690.2903
Max Version:

The description doesn't tie up with the title, and when you Google it, this setting revolves around a previous setting, Disable MAPI cache, that was around in CRM 2011 and helped improve performance. Not a lot more I can say.

AllowPromoteDuplicates

Default Value: False
Type: Boolean
Description: False – Does not allow for the promotion of duplicate records. True – Allows for the promotion of duplicate records. This setting is NOT SUPPORTED IN CRM 2013 as of build 809.
Min Version: 5.0.9690.3731 / 6.0.1.61
Max Version:

Duplicates in D365 are not unheard of, but when Outlook sync comes in there are more chances to get duplicates. When a user syncs an Outlook contact that would create a duplicate, the promotion is blocked by default; setting this to true allows the duplicate record to be created anyway.

AllowSaveAsDraftAppointment

Default ValueFalse
TypeBoolean
Description Setting value to true will provide the capability to create appointments in Dynamics 365 as “draft” without synchronizing with Exchange. Appointment form will have a “Save as Draft” command and a “Send” command, so that you can save, add details and update an appointment activity without synchronizing to Exchange. Default value is set to false to preserve existing behavior.
Min Version 9.0.2.2275
Max Version

This flag recognises that you don't always get a meeting correct the first time, and allows you to save a draft.

Ulrik Carlsson aka @CRMChartGuy from eLogic solutions has a great article about this attribute.

The default behaviour is that users are not presented with a Save as Draft option for appointments; when an appointment is saved, if you are using SSS it will synchronise to Outlook and send out invites to the attendees as normal.

If you change AllowSaveAsDraftAppointment to true, the users get a different set of buttons.

Both the Send and Send & Close buttons behave like the original Save and Save & Close. They will initiate a server-side sync once the record is saved.

The new Save as Draft button will save the appointment but not send it to Outlook etc. It adds a [Draft] prefix to the appointment header, and a new field on the appointment, isdraft, is populated with true.

Weirdly, this field can not be added to a form. It just doesn’t appear in the field list. You can add it as view filter criteria, but you can’t display it as a column in a view.
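If you do need to get at the flag programmatically, here is a hedged sketch of querying it through the Web API, assuming isdraft is exposed there in the same way it is available to view filters (the URL and token are placeholders):

```python
import requests

org_url = "https://yourorg.crm.dynamics.com"   # placeholder environment
token = "<bearer token>"

# Pull back appointments still sitting in draft, i.e. saved but not yet sent to Exchange.
resp = requests.get(
    f"{org_url}/api/data/v9.1/appointments?$select=subject,scheduledstart&$filter=isdraft eq true",
    headers={"Authorization": f"Bearer {token}", "Accept": "application/json"},
)
resp.raise_for_status()
for appt in resp.json().get("value", []):
    print(appt["subject"], appt["scheduledstart"])
```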

AutoCreateContactOnPromote

Default Value: True
Type: Boolean
Description: Disables the ability of the organization to create contact records automatically when an email message is tracked in CRM. This option can also be disabled from the user settings area for each user.
False – Disables the automatic creation of contacts.
True – Enables the automatic creation of contacts.
Min Version: 5.0.9688.583
Max Version:
Max Version

By default, when a user sets regarding within the Outlook app, if any email address in the To / CC / BCC etc. is one that D365 does not know about, it creates a contact automatically.

Most of the time this is fine, but consider when your business process requires a lot more fields to be populated on the contact; this default process will create a contact that hasn’t got what your business needs. Forcing contact creation away from this automation may be required.

Users have an option under their personal settings which mimics this setting, but this does it for everyone.

Setting the AutoCreateContactOnPromote to false removes the option from users and no contacts are created automatically when emails etc are synced.

AutoTrackSentFolderItems

Default Value: False
Type: Boolean
Description: Setting value to TRUE will result in Server Side Sync auto tracking of emails from Sent Items. This setting only applies if the mailbox is configured to track “All Email Messages”. Default value is set to False to preserve the existing behavior. To enable functionality on the organization “AutoTrackSentFolderItems” should be set to True.
Min Version: 8.2.2.0840
Max Version:

This setting works hand in hand with the “Select the email messages to track in D365” option under Personal Options.

By default, sent emails are ignored; only emails that arrive are picked up and sent to D365.

Setting AutoTrackSentFolderItems to true means sent items will also be tracked, from the next sync onwards, not retrospectively.

BackgroundSendBatchSize

Default Value: 10
Type: Integer
Description: Sets the number of email messages to download in one batch for the BackgroundSend API.
Min Version: 5.0.9690.583
Max Version:

I can’t find any information on this. Will update when I find out.

ClientDisableTrackingForReplyForwardEmails

Default Value: False
Type: Boolean
Description: Enables a user not to automatically track replies and forwarded email messages. Set this to “True” to disable tracking replies and forwarded email messages.
NOTE: This setting only applies to Dynamics 365 for Outlook (not Dynamics 365 App for Outlook).
Min Version: 5.0.9690.2903
Max Version:

When a user receives a reply to an email that has already been tracked, the reply will also be tracked by default. This is great for keeping chains of emails all within D365. Unfortunately, this may lead to conversations being tracked that shouldn’t be and give visibility to sensitive conversations – a manager receiving a complaint about a particular email that their report sent, for example.

Whilst this is mostly a training exercise, it can be quite embarrassing and this setting stops that. It does mean that you could lose out on a part of a conversation and rely on the user to track a response separately.

This setting, as noted, only works with D365 for Outlook, not the App for Outlook.

DisableClientUpdateNotification

Default Value: False
Type: Boolean
Description: Setting DisableClientUpdateNotification to true will disable the outlook client from checking for newer versions
Min Version: 7.0.0000.3027
Max Version:

Only for D365 for Outlook, this will prevent the application checking for a new version. This will help if you are in a locked down environment and need the stability.

With the D365 App for Outlook, deployment is continuous, rolled out with other fixes by Microsoft.

DisableImplicitSharingOfCommunicationActivities

Default Value: False
Type: Boolean
Description: Changing this to “True” will disable implicit sharing of records to recipients that are added to existing activities.
Min Version: 5.0.9690.2903
Max Version:

When an email, meeting, phone call etc. is created and an internal user is included in the recipients list, it shares the record with them. This allows the internal user to have visibility of the record within D365.

If your security model has an issue with this, then this implicit sharing can be removed.

Your model may restrict the visibility of activities depending on what record the activity is associated with in a team scenario. If the original recipient is no longer in the team, they should not have access to that information any longer. With the OOTB logic, this activity will still be visible.

The email will still be in the recipient’s Outlook; nothing changes to the visibility in Exchange, it is just the visibility in D365.

DisableLookupMruOnOutlookOffline

Default Value: False
Type: Boolean
Description: LookupMRUItems in UserEntityUISettings can cause a large data volume when going online, setting this to true will stop MRU’s from syncing back ONLINE
Min Version: 6.1.0002.0106
Max Version:

This one is not obvious to me and there is no information online. I’ll update when I find anything out.

DisableSmartMatching

Default Value: False
Type: Boolean
Description: Disables the smart matching functionality and relies on the tracking token on the incoming e-mails for email tracking.
False – Enables smart matching.
True – Disables smart matching.
Min Version: 5.0.9688.583
Max Version:

Smart Matching is how Microsoft works out that the email you just sent in belongs to a conversation that you have already synced, hence it will sync that email when it comes in.

In System Settings, you have several options when it comes to matching. Correlation is the default, where it is using a conversation id on each email to match them. You can supplement this with a tracking token and/or smart matching.

Token is generally used for support scenarios, to ensure any replies to an email are tracked against the same case.

Smart matching does what it suggests, using keywords in the subject and an algorithm to determine if the email is linked to a previous conversation.

Using the DisableSmartMatching flag does the same as un-ticking the box on the system settings, where conversation id and tracking tokens are relied on.

DistinctPhysicalAndLogicalDeletesForExchangeSync

Default Value: False
Type: Boolean
Description: Server-Side synchronization needs a mechanism to distinguish between Logical and Physical deletes of entities in CRM.
False – No distinction between physical and logical deletes for exchange sync delete scenario
True – Physical and logical deletes will be distinguished for exchange sync delete scenario
Min Version: 8.2.2.0840
Max Version:

This is another one where security takes over and users’ expectations can differ from the way Microsoft thinks it should work.

If a user has been invited to a meeting, and it is recorded in D365, a copy of that meeting exists within D365 and Exchange. If, for whatever reason, the user loses access (read access) to that meeting in D365, the default behaviour would be to delete the copy in Exchange. Makes sense, to keep those in sync.

With DistinctPhysicalAndLogicalDeletesForExchangeSync set to true, loss of access to an activity does not mean that the activity is deleted in Exchange. Use this with DisableImplicitSharingOfCommunicationActivities to get full control of activity access.

DoNotIgnoreInternalEmailToQueues

Default Value: False (version 5.0.9690.1533 to 8.2.2.1300); True (version 8.2.2.1309 and higher)
Type: Boolean
Description: If you disable the “Track email sent between CRM users as two activities” setting, email messages from a CRM user to a queue are not delivered. Additionally, if a workflow rule sends an email message to a queue, email messages that are sent by the workflow rule are not delivered.
False – Internal email messages to queues will not be delivered.
True – Internal email messages to queues will be delivered.
Min Version: 5.0.9690.1533
Max Version:

This is used in combination with the “Track emails sent between Dynamics 365 users as two activities” option available in the system settings.

If you enable the track-as-two-activities option, normally any email from an internal user to a queue mailbox will be ignored. This seems a weird consequence, but Microsoft have provided an override so that these internal mails are not ignored.

EnableAppointmentBroadcastingForOutlookSync

Default Value: 0
Type: Number
Description: Setting for Appointment broadcasting for Outlook Synchronization
Min Version: 7.0.1.121
Max Version:

I can’t find any information on this. Will update when I find out.

EnableCrmStatecodeOnOutlookCategory

Default Value: True
Type: Boolean
Description: Enables Statecode data on contact sync
Min Version: 6.1.0.581
Max Version:

I can’t find any information on this. Will update when I find out.

EnableSssItemLevelMonitoring

Default Value: False
Type: Boolean
Description: Setting value to True will enable a new dashboard accessible by users and administrators called Server-Side Synchronization Failures. This dashboard allows the owner of a mailbox to have information about all non-synched incoming/outgoing emails and also appointment, contact, and task (ACT) items. Information is provided for the reason items are not synchronized. Default value is set to False to preserve the existing behavior. You can use the ExchangeSyncIdMappingPersistenceTimeInDays setting to control how long the data for failed emails is retained.
Min Version: 8.2.2.1661
Max Version:

There is already a dashboard called Server-Side Synchronization Failures available to admins without this setting. I am not sure what this does, as the dashboard seems to be available regardless.

ExchangeSyncIdMappingPersistenceTimeInDays

Default Value: 3
Type: Int
Description: The number of days for which the ExchangeSyncIdMappings are to be persisted for failed emails. This setting is used in relation to the EnableSssItemLevelMonitoring setting. It is not recommended to increase this value higher than 7 days as it can lead to the table growing very large.
Min Version: 8.2.2.2059
Max Version:

This setting defines how many days of sync failures are kept, useful when you are troubleshooting, but the table will get huge quickly, so only increase it if you need to.

ExpireSubscriptionsInDays

Default Value: 90
Type: Number
Description: Max number of days before deleting inactive Outlook client subscriptions. We recommend you keep this to the default unless you absolutely need to change it, be mindful of keeping the tracking info too long, or deleting it too soon.
Min Version: 6.0.0.0
Max Version:

On creating my second post in this series I came across several configurations that were not documented in the KB article and hence were missed when I wrote this.

When you track a contact in Outlook, you are subscribing to changes made to that contact in D365 so that they are mirrored in your Outlook. This is great, but each subscription is stored in a database record, hence impacting storage costs. A deletion service works through the subscriptions and deletes those that have been inactive for longer than this value, with the Outlook client refreshing its subscriptions as part of its sync routine.

HideEmailAutoTrackOptions

Default Value: False
Type: Boolean
Description: Default value is false, if it’s set to True: do not show the following track options in Personal Options (Email): ‘All email messages’, ‘Email messages from D365 Leads, Contacts and Accounts’, ‘Email messages from D365 records that are email enabled’
Min Version: 9.1.0.1639
Max Version:

This setting goes one further than the one below by stripping out “All email messages”, “Email messages from D365 Leads, Contacts and Accounts” and “Email messages from D365 records that are email enabled”, just leaving you with the two remaining options.

HideTrackAllOption

Default Value: False
Type: Boolean
Description: Removes “All email messages” option from users’ Personal Options under Email tab Select the email messages to track in Microsoft Dynamics 365 area.
False – “All email messages” option is shown in the dropdown.
True – “All email messages” option is not shown in the dropdown. If a user already has “All email messages” selected, their synchronization option is not updated in DB. Administrators will need to update this value via SDK.
Min Version: 9.0.2.264
Max Version:

Under personalisation settings for each user, they can decide to track all emails they receive from any source. Great for a shared mailbox or customer mailbox, but not for a normal user who receives spam and invites to cake sales etc.

The default here is Email messages in response to D365 mail, but to stop users filling your D365 instance, setting the HideTrackAllOption to true will remove that top option.

Any users that had this setting prior to its removal need to be updated manually or via the SDK.

MailboxStatisticsPersistenceTimeInDays

Default Value: 3
Type: Number
Description: If value is 0, don’t store ANY MailboxStatistics Data, if the value is greater than zero then store that number of days statistics data. Max value arbitrarily chosen at 1 year, this generates a lot of data so 1 year should be plenty of time
Min Version: 8.0.0.1088
Max Version:

Mailbox statistics record how frequently a mailbox is accessed and synced. This way, the more active mailboxes are synced more regularly. A mailbox that is infrequently used will be checked less regularly.

On a high user system, with SSS on, it can get populated quickly, so 3 days will normally be appropriate.

OutlookClientEmailTaggerEnabled

Default Value: False
Type: Boolean
Description: There are 3 values for this Boolean setting – true, false, and NULL (which is the value when NOT set). True: Will override any and all client registry setting to True. False: Will override any and all client registry setting to False. NULL: If the setting is NULL the outlook clients will use whatever is in the registry of the client. TO SET THIS VALUE TO NULL YOU WILL NEED TO CLICK EDIT, THEN REMOVE THE VALUE TO HAVE IT DEFAULT TO NULL.
Min Version: 7.0.1.121
Max Version:

I can’t find any information on this. Will update when I find out.

OutlookSyncDataSubscriptionClientsBatchSize

Default Value: 100
Type: Number
Description: This setting is used to determine how many record changes (deletes, inserts, and updates) to send back to a syncing client for each request.
Min Version: 7.1.0.1059
Max Version:

I can’t find any information on this. Will update when I find out.

OverrideTrackInCrmBehaviour

Default Value: 0
Type: Int
Description: When this option is Enabled, the ‘Track in CRM’ button functions as the Set Regarding button in Dynamics 365 for Outlook. In Dynamics 365 App for Outlook, ‘Track without regarding’ command is not displayed, with Set Regarding as the only way to synchronize Outlook items to Dynamics 365.
0 – Normal behavior of the “Track in CRM” button not having to set a Regarding record in Dynamics 365 for Outlook. ‘Track without regarding’ command is displayed in Dynamics 365 App for Outlook.
1 – The ‘Track in CRM’ button functions as the ‘Set Regarding’ button, and makes you select a regarding record in Dynamics 365 for Outlook. In Dynamics 365 App for Outlook, ‘Track without regarding’ command is not displayed, with Set Regarding as the only way to synchronize Outlook items to Dynamics 365.
NOTE: This setting applies to both Dynamics 365 for Outlook and Dynamics 365 App for Outlook.
Min Version: 9.1.0.6200
Max Version:

Normally, a user can track an activity to D365 without associating it with a record via Set Regarding. This could leave activities in your tenant that are not associated with a record, effectively orphaned. Depending on your business requirements, disabling this feature could be required.

Normally, under the … menu for a Not Tracked item, the user has an option to Track without Regarding.

Setting OverrideTrackInCrmBehaviour to 1 will override this behaviour, removing the ellipsis option altogether. The user has to establish a link to an existing record to sync the email or activity.

OverrideV5SenderConflictResolution

Default Value: False
Type: Boolean
Description: When multiple records with the same email address exist in the Dynamics CRM Organization and email is automatically tracked, the email address is resolved to the record for the owner record that was created first. This option lets you override that functionality.
False – E-mails are tracked to the first record created.
True – E-mails are not tracked automatically if there are multiple records with the same email address.
Min Version: 5.0.9690.2243
Max Version:

Michael Sulz has a good write up on this, here.

Normally, if there are 2 or more contacts with the same email address (data quality is always a problem, however much care you take, though Data8 do a really good job of removing duplicates and improving your data), the contact chosen is the first contact owned by the syncing user, sorted by created date, or the first created if none match.

Setting this option to true will force the user to make a decision and not sync the email automatically.

RestrictIRMEmailItems

Default Value: False
Type: Boolean
Description: Setting value to TRUE will result in Server Side Sync NOT synchronizing ALL emails that are marked as IRM emails. Default value is set to False to preserve the existing behavior. To enable this restriction on the organization “RestrictIRMEmailItems” should be set to True.
Min Version: 8.2.2.0840
Max Version:

Bhavesh Shastri has a great write up of this configuration here

Restricted messages, those that the sender has marked as any of the restricted types in Azure Information Protection, may not be suitable to be included in your D365 system.

If you set this flag to true, the user will not be able to sync those that are protected and will be given an error message if they try to.

SecuritySettingForEmail

Default Value: False
Type: Number
Description: 1 = Display a Warning Message and give an option to open. 2 = Display a Warning Message and do not give an option to open. 3 = Do not display a Warning Message and do not give any option to open. This setting is NOT SUPPORTED IN CRM2013 as of build 809
Min Version: 5.0.9690.3731 / 6.1.0.581
Max Version:

The majority of emails that a user receives and hence sync to D365 contain HTML to some degree, whether it is simple formatting or full on marketing emails.

In all scenarios, the interface presents a stripped down version of the email, but formatting etc will be lost.

There is a risk when these are displayed in all their glory in D365, that parts of the email could be nefarious, including scripts etc that could include phishing or other attacks. Microsoft by default warns the user that this is the case, but allows the user to click through to the content, putting the decision in the users’ hands.

If you change the setting to 2, the link to the full content is removed.

Changing the setting to 3 removes the message and always shows the full version of the email.

SendEmailSynchronously

Default Value: 0
Type: Int
Description: If you have a plugin registered on the email send flow, you should change this setting to “1.”
0 – Email is sent asynchronously.
1 – Email is sent synchronously.
Min Version: 5.0.9690.2720
Max Version:

Depending on your logic, you may interact by workflow when an email is sent via Outlook. This setting moves the send email to a synchronous operation rather than asynchronous, allowing a more immediate interaction with the email. This may have a performance impact on the user in Outlook.

SortEmailByReceivedOn

Default Value: False
Type: Boolean
Description: When the Activities tab of the social pane is shown, the data is ordered by the ‘modifiedon’ date in descending order, toggling this setting to True will enable the social pane to sort emails by ReceivedOn Desc instead of modifiedon
Min Version: 8.0.1.79
Max Version:

I am not sure this is a problem any more; in the social pane in D365 we have a lot of options for searching, but back in the legacy UI this allowed you to change the email sorting from the date the email was edited or added to D365 to the date the email was received. This could be several days’ difference, so it could give a different perspective on the conversation.

TraceExchangeSyncData

Default Value: True
Type: Boolean
Description: Enables exchange sync tracing
Min Version: 6.0.0.809
Max Version:

Logging of the sync data is essential for any troubleshooting, but it adds to the size of your database. With the separation of log and data in storage costs, I am not sure this should ever be turned off if you are using SSS.

TrackAppointmentFromNonOrganizer

Default Value: True
Type: Boolean
Description: Enables users to track appointments organized by another Dynamics 365 user via Dynamics 365 App for Outlook.
False – Dynamics 365 App for Outlook and Server-Side Synchronization users cannot track Outlook appointments whose organizer is a Dynamics 365 user.
True – Dynamics 365 App for Outlook and Server-Side Synchronization users can track Outlook appointments whose organizer is a Dynamics 365 user.
Min Version: 9.1.0.0294
Max Version:

You can always track a meeting if it was sent from an external user, and by default you can also track any appointment where the organiser is a D365 user. Setting this to false prevents users tracking an appointment if they are not the one organising it.

TrackCategorizedItems

Default Value: True
Type: Boolean
Description: Setting value to False will remove the category tracking flag and functionality. Default value is set to True to allow category tracking and tracking status visibility for users who do not use Dynamics 365 for Outlook or Dynamics 365 App for Outlook.
Min Version: 8.2.2.0840
Max Version:

Using Category based tracking is a great way to allow users to track multiple emails at once. In the App for Outlook, this is the only way.

With the OOTB behaviour, the user gets a new category added and is able to select multiple emails to sync. It also appears as a great indicator in Outlook that the activity is synced.

Setting the flag to false removes this category and ability.

Be warned on this: if you leave any item with the category on it after you have disabled this functionality, re-enabling the functionality will mean that those items are synced. Also, this category doesn’t respect the fact that you upgrade. An email with the category that was synced to an on-premise version will create a duplicate if that user is moved to the online version and the originating email was migrated as part of the data migration from on-premise to online.

UseCrmOrganizerForEmptyExchangeOrganizer

Default Value: False
Type: Boolean
Description: Use the CRM Organizer of an Appointment if the Exchange Organizer doesn’t exist.
Min Version: 8.1.1.1020
Max Version:

I think this is more to do with rare cases when the sync doesn’t work correctly, but another one that I can not find any information for.

UseFilteringMethodOfSyncingMailboxOnlyForCorrelation

Default Value: False
Type: Boolean
Description: This is for controlling which users’ filtering settings will be used for correlation.
False – filtering method of all recipients of the email will be checked to decide if any user/queue accepts email or not.
True – filtering setting of user who synced email to CRM will be used. Filtering setting of other recipients of the emails will be ignored.
Min Version: 8.2
Max Version:

Each user has a separate filter list to decide which emails are synced to D365. These can be various settings based on what that individual user requires.

The default for this setting, the standard OOTB behaviour, is false, where any user can sync an email if it matches their settings. True means that only the filtering settings of the user who synced the email to CRM are used; the filtering settings of the other recipients are ignored. It is in effect an additional filter for the user of “only include emails I have created”.

UsePlainTextForEmailTemplateBody

Default Value: False
Type: Boolean
Description: Changes the Email Template to use plain text where otherwise text with the following symbols would not appear <text>.
Min Version: 5.0.9690.2720
Max Version:

This is one of the older settings, presumably from when people had email clients that could not handle HTML-formatted text.

MS Certifications – Sharing my Mind Maps

Over the last 15 months, I have completed 5 Microsoft certifications and have used various sources on the internet to get me through.

My method of remembering facts relies on completing a mind map. This way, I note down the key points (key to me anyway) for a subject and link them together, serving as a quick refresher on the day and forcing me to fill out my knowledge as I research the content and develop my understanding.

Please don’t think that memorising these maps will be all you need to do to pass any of the exams. I share them to assist you in getting that certificate and maybe trigger something that you don’t understand so you can research that area more effectively. I also don’t guarantee that they are all accurate. The vagaries of time and my ineptitude will ensure they are not.

It goes without saying that practice is the best way of ensuring you have the right understanding of the product, but I have found that there are always areas where you have not come across, even if you have been working on an application for years. Microsoft never stops developing.

I will keep this page up to date as I complete more exams with the mind map and any resources I used, with the exam I took most recently at the top.

All the mind maps can be found here. I use Freeplane as my mind mapping tool of choice, because it is open source, free for unlimited maps, allows tweaking of styles etc. and is not trying to sell you its bigger brother. There are others out there which offer increased functionality, but this works for me. You can import Mindmap files into most other services.

MB-210 – Sales

Mind Map

Sales is my bread and butter, with lots of experience in what the life-cycle is etc, but the new parts, Sales Insights was new.

Again, I used Neil’s blog. Great insight, though I would also get an understanding of the minimum requirements for some services, which I wasn’t ready for.

The scenarios are full on too, though the advice is to read each one through before tackling a question and re-read the sections before giving your answer.

MB-240 – Field Service

Mind Map

Field service was a stretch goal for me, as I have never used it in anger on a client site. It is a relatively new product in the suite, but one that is getting a lot of traction in the D365 community.

I went into this exam thinking I was not prepared, but managed a decent pass with my learning. The exam itself is quite short, but did stretch my understanding.

My first resource, as always, was Neil Parkhurst. He has a great series of posts on this exam’s predecessor, MB2-718. Whilst slightly dated, it is still relevant.

Further, I went through the Microsoft Learn topics on Field Service here.

MB-200 – Customer Engagement Core

As I have been working with D365 a while, I was very lazy when going into this one. I just assumed I knew it all, brushed up on the new stuff and went for it. Thankfully, this didn’t bite me in the ass.

I didn’t do a map for this one, but I reviewed the map for MB2-716 here.

I also reviewed Neil’s posts on this one and @NZ365Guy’s videos on the subject on the OpenEdx course, which are great, particularly for the new stuff in the application, like Flows.

AZ-203 – Developing Solutions for Azure

Mind Map

I consider myself to be a lapsed developer. I love coding, but my career has taken me more towards being an advisor and designer than a “coder”. This exam was proof to myself that I could still run with the cool kids and also exposed me to a lot of the Azure stack that I did not know.

As I was preparing for this exam back in January, to get an Azure Developer Associate certification, it swapped from 2 exams to 1, AZ-200 being the first; that is why the references to AZ-200 linger.

For this exam I was indebted to the Pluralsight course, which I am thankful I had access to.

From a D365 developer point of view, this was a tough one. It was a step above and beyond what I expected. There is so much content, each area needing a decent level of understanding, that it really taxed me. This is the first exam to push the time I had available to complete it, with a large number of questions and a lot of deliberation.

MB2-718 – D365 for Customer Service

Mind Map

This was the last of the older exams I took before Microsoft revamped the exams. Neil Parkhurst has an excellent blog on this one, which was my source. I have been doing service for quite a while, but the intricacies around SLAs and Queues were something that I had to learn. It also contained Unified Service Desk and Voice of the Customer, both subjects I had not come across.

MB2-716 – Customisation & Configuration

Mind Map

Another old exam, and something I take pride in as it is my bread and butter. This was another re-cap of my understanding, particularly around the bits where I would just Google when I came across them in my day job. Neil Parkhurst again provided the detail which saw me through. Bits on auditing and configuring email etc. were items where I knew the fundamentals, but Microsoft has a knack of slightly tweaking the wording to give a different answer, so it is vital you know your stuff.

MB2-715 – D365 Online Deployment

Mind Map

I am fortunate to have come to D365 after online was the chosen deployment method. I have not got nightmares like some older community members around installing and configuring on-premise solutions. This is another exam that is part of my day-to-day role so it was just a matter of brushing up on where I did not have enough knowledge.

Neil Parkhurst (who doesn’t owe Neil a beer?) has it covered again. There is a lot you take for granted here, that you need to get to grips with, such as licensing, what you need to do in the Office Admin portal vs D365 admin, Email configuration and integration with other apps like SharePoint.

Cloning Flows: Location triggers for everyone

Sometimes ideas don’t work out. This is one of those times. But the reason I blog is to learn, to expand my knowledge of the Power Platform and of components outside of it. So, I figured I would blog about my failure; learning is learning. Then, as I started testing the flow again, moving environments etc., it started working. I guess this is down to the location trigger being a work in progress. Moral of the story: if it was broke last month, try again this month.

Back in July, I started working on this scenario, but couldn’t get it working. I noticed @Flow_Joe_ & @JonJLevesque did a video walkthrough of using the Geofence trigger to send out a summary of the customer when a salesperson enters an area, which reminded me of my failure, hence why I have written it up. While Joe & Jon’s video shows us how easy it is to create a flow, for salespeople in general I think this is a step too far. You can not expect a salesperson to have any interest in creating flows to do this, but you can expect them to click a button on a form within their D365 application.

Objectives

  • The Scenario
  • Creating the Flow button
  • Cloning the Flow
  • Outcome

The Scenario

Numerous times when I have been a customer, a salesperson would come to us not knowing that we have several major cases logged with them against their product. This is mainly down to lazy sales people (I know, they don’t exist), but it would be awesome for the salesperson to get a summary of the account when they get in the door of a customer. The number of support cases, a list of the open opportunities and orders, any complaints that have been logged. All of this information is available to the salesperson via the D365 mobile app, but it would be good to ensure that they get this information and are less likely to get caught out by a customer venting at them for 5 critical bugs that have been sat around for a month.

The Solution

Flow has a new trigger, still in preview, Location, which is fired via the Flow application when a user enters or exits an area. This is perfect for our scenario: stick a geofence around a customer’s location and, when the user enters the area, it gets triggered. Look up the customer, format an email and send it to the user.

Flow is user friendly, a low code solution, but you can not expect a salesperson to create a flow for each account they want this trigger for. What can be done is to put a button on a form which automatically creates a Flow for the user against the account they have selected, which is then triggered when the user enters the location.

There are 2 separate series of flows required. The first starts with an action from the user on the account record, which triggers cloning of a template.

The second series is the clone of the template, which triggers sending the salesperson the relevant information when they enter the customer’s property.

Creating a Flow Button

Starting with a CDS “When a record is selected” trigger, configure it to be used when an account is selected.

The next step is to retrieve who is running this flow. As mentioned, this button will be published on an Account form, so it is essential to know who is running it so that an email can be sent to them. The account information and who the user is are sent as the body to an HTTP POST trigger, which is the next flow in the chain.

An HTTP trigger is used because the next Flow requires enhanced access. An admin user needs to clone a Flow, which you would not want a normal user to be able to do. The admin is used as well to ensure any runs that happen are legitimate. The admin or sys account shouldn’t belong to someone who could have Flow in their pocket.

To have the URL to send to, the next Flow needs to be created first, but just to show where this button appears within the D365 interface: the first time you run it, there are a few confirmations you need to work through, then finally you can run the flow.

Cloning the Flow

This flow clones an existing template, tweaks it slightly and gets it up and running as the user.

Starting with an HTTP Trigger, I use a sample payload to build the schema.
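As a rough illustration, the sample payload looks something like the JSON below. The Account property matches the triggerBody()?['Account'] reference used later in the clone; the User property name and the values shown are just placeholders for this sketch.

{
  "Account": "11111111-2222-3333-4444-555555555555",
  "User": "sales.person@example.com"
}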

Next is retrieving the account. As the account id is passed in from the calling Flow, a simple Get Record is used.

Next, configure the name of the Flow that will be created, making it unique for the user by adding their email address in. A flow definition string is also initialised for later.

In this Flow, the user that called it from the button is needed, so it retrieves the profile using the Office 365 Users action.

Next, retrieve my template flow. Flow has several actions around management of Flows, which are incredibly useful to a Flow administrator. The template flow is a simple flow which has a location trigger and a call to an HTTP trigger to call a secondary flow. I will discuss the detail of this later.

The next couple of actions try to determine if a flow with the FlowName defined already exists, firstly by getting a list of all my flows (as an admin), then getting a list of Flows in the organisation, then filtering it with the flow name that was defined in the initial steps.
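To give a flavour of that check, a Filter array condition along the lines below would do it, followed by a condition on the filtered output. The properties.displayName path is an assumption about the shape the Flow management connector returns, and the action name Filter_array is mine.

Filter array condition: @equals(item()?['properties']?['displayName'], variables('FlowName'))

Condition (a matching flow already exists): @greater(length(body('Filter_array')), 0)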

If there is a flow already, just stop. If not, carry on & clone the template flow.

The Template

The Log Template is a very easy, small flow with a location trigger and an HTTP call action. The HTTP call passes in the user’s location, the account id and the user who started the process. Both email and account will be swapped out as part of the clone.

The trigger region is essential for any location trigger. This one is triggered off the Microsoft campus in Redmond. Someday I will be fortunate enough to go to the motherland. I chose this as it is not likely that the user would have them as a client, but it doesn’t really matter where you choose, as what you need is the latitude and longitude from it so you can replace it when you clone the flow.

If you click on the peek code button against the trigger, it shows a JSON representation of the trigger. The latitude and longitude are that of the Microsoft office and this is the bit I need to replace.

Cloning the Flow (part 2)

At its heart, a Flow is just a JSON file. Obviously, how it is rendered and how the hooks and actions work are the power, but the definition is a JSON file. Using this knowledge, we can create a new version of the template, with a location specific to the account.

The template in all its glory is below. Just using simple find / replace, we tweak it to the specific location, account and user.

{
  "$schema": "https://schema.management.azure.com/providers/Microsoft.Logic/schemas/2016-06-01/workflowdefinition.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "$authentication": {
      "defaultValue": {},
      "type": "SecureObject"
    }
  },
  "triggers": {
    "manual": {
      "type": "Request",
      "kind": "Geofence",
      "inputs": {
        "parameters": {
          "serializedGeofence": {
            "type": "Circle",
            "latitude": 47.64343469631714,
            "longitude": -122.14205669389771,
            "radius": 35
          }
        }
      }
    }
  },
  "actions": {
    "HTTP": {
      "runAfter": {
        "Initialize_Email_Variable": [
          "Succeeded"
        ]
      },
      "type": "Http",
      "inputs": {
        "method": "POST",
        "uri": "https://prod-68.westeurope.logic.azure.com:443/workflows/<GUID>/triggers/manual/paths/invoke?api-version=2016-06-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=<SIG>-JQQvYT0",
        "body": {
          "lat": "@{triggerBody()?['currentLatitude']}",
          "long": "@{triggerBody()?['currentLongitude']}",
          "user": "@{variables('Email')}",
          "account": "@{variables('accountId')}"
        }
      }
    },
    "Initialize_Account_Variable": {
      "runAfter": {},
      "type": "InitializeVariable",
      "inputs": {
        "variables": [
          {
            "name": "accountId",
            "type": "String",
            "value": "<accountId>"
          }
        ]
      }
    },
    "Initialize_Email_Variable": {
      "runAfter": {
        "Initialize_Account_Variable": [
          "Succeeded"
        ]
      },
      "type": "InitializeVariable",
      "inputs": {
        "variables": [
          {
            "name": "Email",
            "type": "String",
            "value": "<email>"
          }
        ]
      }
    }
  },
  "outputs": {}
}

Back on the clone flow, the next step is to convert the template to a string. This makes it easier to replace the latitude, longitude etc. with the ones we want.
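For reference, the conversion is a one-line expression; the action name Get_Flow and the properties.definition path are assumptions about how the template is returned by the Flow management connector.

string(body('Get_Flow')?['properties']?['definition'])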

On the OOTB account record there are latitude and longitude fields. This data is not normally populated, but it is used by Field Service and other applications. I used Field Service to populate it using the Geo Code button.

As you can see from the above, Field Service populates both latitude and longitude to 5 decimal places. This is common precision when you use any mapping software such as Google, so I am not sure why, if you do the same via the Flow trigger, you get precision to 15 dp for latitude and 17 dp for longitude.

The next 2 steps are there because of me trying to get the flow to work. One of my thoughts was that the flow was expecting all 15 of the decimal places to be populated, so these steps pad out the number you have against the account with additional digits.

The expression is the same for both

concat(string(body('Get_Account')?['address1_latitude']),'111111')

The next step replaces the newly calculated values for latitude and longitude in the JSON definition.

replace(replace( variables('flowdefstring'),'47.643434696317136',outputs('Replace_Lat')),'-122.14205669389771',outputs('Replace_Long'))

The accountid is also replaced. This is used in the cloned flow to define which account the user selected. The trigger only gives you the user’s current location, not the centre of the circle you configured. You could use these values & find the account, with difficulty, unless there is something I am missing. I prefer to add a variable in the clone, which is the account id.

replace(outputs('Replace_Lat_Long'),'<accountId>',triggerBody()?['Account'])

The same goes for the email to send to: it should be the user who triggers the geofence, but it seems to be the admin. As I clone the Flow with an admin account and then add the user as an owner, it runs under the admin account.

There is enough info now to create this flow. Using the Create Flow action, the new flow is created and up and running.

I use a JSON expression to convert the string I have used to find / replace the latitude, longitude etc. to ensure the Flow is created with JSON.

json(variables('flowdefstring'))

The final step is to add a Flow owner. As the sales person who triggered the flow is who it should trigger on, make them the owner, so it should run under their context.

Outcome V1

Ignore this bit if you want to avoid the author moaning about stuff that doesn’t work.

If I run the whole flow, I do generate a new Flow.

Going into what was generated, using peek code again, you can see that the Microsoft location has been replaced with the Avanade office.

The trigger is active, but this is where it stops. I can not get this trigger to fire. Changing the location to my home, going for a walk and coming back doesn’t trigger it.

If I don’t put in the padding for the latitude and longitude, it doesn’t trigger.

If I clone from my location, not changing the latitude and longitude, still the trigger doesn’t fire.

If I configure a new trigger from scratch, that works.

Everything about the trigger looks the same when you get it in code view, but there must be something different.

This is where I started reaching out: I tweeted about it to the gods of Flow and asked in the Flow forum, where I did get a response, basically saying the same, and that the location trigger is in preview.

So, if you have got this far, how do I fix it?

Outcome V2

Like I said at the outset, this didn’t work for me. Frustration set in, and I forgot the idea. But, as I was putting together this blog post, re-deploying the components as my demo system had expired, it worked!

So, moving on, we need to send an email to the user with the playbook for the account. I want to list the last 5 critical cases, last 5 open opportunities, last 5 notes and any description the user has put in.

It is triggered by an HTTP request, the schema defined by a sample payload, which contains who triggered the workflow and which account.

Then, a great time for a parallel branch. The Flow retrieves the cases, notes and opportunities in a parallel branch.

Each branch does a similar thing. Looking at the Notes branch, it firstly retrieves the records with a CDS List Records action, using an OData filter and order by, returning the top 5 only.
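As a sketch, the List Records action for the Notes branch could be configured along these lines; the filter assumes the notes are linked to the account via the regarding object (objectid) lookup, so treat the exact field names as assumptions and check them against your own schema.

Filter Query: _objectid_value eq @{triggerBody()?['Account']}
Order By: createdon desc
Top Count: 5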

Next, put this in an HTML table, selecting the output from the Get Notes action. I select the Advanced options, then Custom columns; this way I can define the order and which columns I want to display.

The final step is to send an email.

Obviously, this can be customised to your business need, but my example lists the cases, opportunities & notes, and reminds them to fill in a contact report.

Summary

So, the user selects a button on an account form, which allows them to receive updates about one of their customers when they enter the location of the account. Easy.

I tested this with my home address and with a different user and you can see that I get the email through. Veronica is in the US, I wasn’t up at 1am writing blogs & fixing Flows.

You can also see that Flow notifies the user that it has made them an administrator on a Flow.

This Flow starts with a Flow button on a record, making it a user-initiated process. It could be triggered off a record creation process – If the user follows an Account, create this automation for them, as long as they have opted in.

There is location tracking in the Field Service application, but that requires the Field Service mobile app and is not suited to a salesperson. With this approach, they just need to install the Flow app on their device and forget it is there.

AI Builder – Text AI

My blogging journey started with using LUIS, one of Microsoft’s Cognitive Services, to automate case assignment. This blog goes into detail about how it all hung together, using a model defined in LUIS, calling the LUIS endpoint when a new case is created and classifying the case, by the subject, with the result from the call.

After my summer break (sorry, but family etc. comes first) I thought I would revisit this scenario, but using one of Microsoft’s shiny new AI Builder capabilities, the Text Classification AI model.

Objectives

  • The Scenario
  • Training your Model
  • Getting Data
  • Publishing the Model
  • Using the Tags

The Scenario

In my first blog, I went through the scenario, so not wanting to repeat myself, but for the lazy who don’t want to click through…..

Big Energy is a supplier of energy products to end users. They have a call centre which handles any query from the customer. As a perceived leader in the sector, it is always willing to use the latest technology to allow users to interact with them, which reduces the pressure on the customer support centre.

Big Energy has a mailbox configured to accept customer emails about anything and, rather than have a group of 1st line support employees filtering out and categorising the emails based on the content, wants to use cognitive services to improve the process of getting the email (the generated case) to the right team.

Using AI to file the case

LUIS does a great job of this, with a BA providing sample utterances for the model and training it.

Text Classification AI Model does it slightly differently. The model expects users to provide data (in the CDS) in the form of text blocks and representative tags for the data. Both need to be in the same entity in CDS.

On a standard Case record, the classification or tag is the subject field. This is a parent record of Case and the tag would be the name of the subject. As subject and case are separate entities, the Text Classification AI model will not work. A field, be it a calculated one, has to be introduced to enable the classification AI to work. Adding data to an entity from a parent entity breaks my Third Normal Form training (anyone remember that? Is it still a thing?).

I have raised this issue as a new idea on the PowerApps ideas forum, go there and give it a vote!

The new logic for our AI model is that the AI will classify the incoming case, adding a tag. This will trigger a flow, changing the subject of the linked case accordingly. This will trigger re-routing of the case like it did in the original LUIS method.

Training your AI

Any AI model needs training; the AI needs to separate the wheat from the chaff. Creating a model is simple in PowerApps.

Start at make.powerapps.com and select AI Builder, then Build.

There are 4 options here

Binary Classification is useful to give a yes / no decision on whether data meets certain criteria. The criteria can be up to 55 fields on the same entity. For example, is a lead with a low current credit limit, high current account value, no kids but has a pink toe nail (shout out to Mark Christie) likely to get approved for a new loan?

Form processing is intended to assist users in automating scanned documents to prevent re-keying. An example would be any forms hand written as part of a sales or service process (before you convert to a PowerApp, obviously).

Object detection assists in classification of items, be it types of drink, crisps or bikes, etc.

Text classification decides on a tag for a block of text, for example, a user could enter a review of a product online and text classification could understand what product it was for or whether it is a positive review.

All 4 of these have origins in the Cognitive services provided by Azure, LUIS being the big brother of Text Classification.

Ensure you are in the correct environment. Text Classification only works on data within your CDS environment, so don’t expect to reach out to your on-premise SQL server. There are ways to bring data into CDS, not in scope for this discussion.

Selecting Text Classification displays a form to give you more understanding, and it is here that you name your model.

Hit Create and then Select Text. This will list all your entities in your CDS environment (in my case, a D365 demo environment).

Select the entity you want, Case for our PoC.

The interface will then list all the fields suitable for the AI model, namely anything that is a text field. I chose the description field, which is typically the email that the user enters when emailing in a case to the support department.

Hit the Select Field button and it will present you with a preview of the data in that table.

The next screen is to select your tags. These need to be in the same table and, as already discussed, this is a bit of a limitation of the AI Builder. Less normalised data is more common in Canvas apps or SharePoint linked apps, but for structured data environments with relationships and normalised data this is a limitation that will hopefully be removed as Text Classification matures.

Also, option sets are not available, which is another common categorisation technique. Multi-select option sets would be an ideal tagging method too. I assume this will come in time.

For my PoC, I created a new field, put it on the Case form and started filling it in for a few records.

Select the separator. If your tag field contains multiple tags, separated by a comma or semi-colon, this is where you configure it.

It also gives you a preview of what tags the AI Builder would find using your chosen method. You can see in the No separator option, “printer; 3d” is one tag, rather than the assumed 2 tags displayed if semi-colon is selected. This depends on your data.

The next page displays a review of your data and the tags that the AI Builder finds.

Next, select a language for the text field dependent on your data.

Once selected, train your model. This is where I started to run into problems. My initial population of tags was not enough. The training came back quickly with an error. There should be a minimum of 10 texts per tag, which I didn’t have. That would be hundreds of rows. How was I going to automate creating data to give the Text AI enough data to be a suitable demo?

Getting Data

I need thousands of records to train my model properly, thousands of records relevant to the tags I create. No online data creator seemed suitable, as it wasn’t specific enough, so how? A flow.

First I created a column in the Contact table to store a number for my contact, a unique number so I can randomise the selection of a contact.

Next, I need some data for the case description and the tags. This is already done as it is the same as the utterances and intents I used for LUIS, so I exported the LUIS configuration, put the data in an Excel file & added a number to that.

Ready for the Flow

My simple flow is described below.

Ask for the number of cases to create, keep creating cases until you have reached that limit using a random contact and a random description.

This flow is triggered manually, so I start with a manual trigger and also prompt for the number of cases to create.

The Subject variable is used later to define the reference for the subject we have chosen.

The default limit for loops is 60. I realised late in the day that you can change that, but breaking up loops is good practice, to limit the scope of what could go wrong, so I created a loop-within-a-loop structure for my flow.

I restrict the inner loop to 50 iterations maximum, which means the number of times I run the outer loop has to be calculated. If I want 920 cases created, my outer loop would run 18 times, each pass creating 50 cases, and I would then do a final set for the remaining 20.
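The batch arithmetic can be handled with div() and mod(); these expressions are only a sketch and assume the manual trigger input holding the requested total is called number.

Outer loop count: div(triggerBody()['number'], 50)   (920 gives 18)
Final pass size: mod(triggerBody()['number'], 50)   (920 gives 20)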

The next steps will initialise some counters used in the loops. I also want to ensure that if the user wants to create less than 50 records, the outer loop doesn’t run at all.

The outer loop will run for the number of loops I have calculated. This is the loop condition. The counter increments as the last thing in the outer loop. The first thing in my outer loop is to reset the case counter. This is the counter for the 0-50. If we are in this inner loop, at least 50 cases will be created.

The first thing it does is get a random contact, using an OData filter on the number field created specifically for this, with a random number from 0-875 (875 being the highest number in that table).
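For illustration, the filter could look like the line below; new_contactnumber is a made-up schema name for the numbering column, and rand() is exclusive of the upper bound, so rand(0, 876) returns 0-875.

Filter Query: new_contactnumber eq @{rand(0, 876)}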

Once the contact is retrieved, find a random description / tag combination. The data from the LUIS utterances is held in an Excel file on a Teams site. Again, a rand() function takes a random number up to the maximum in that file.

Because more than one subject row could be returned, and because I don’t like nesting apply to each loops inside each other, I set the subject Id variable.
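A first() expression over the returned rows avoids the nested loop; the action and column names here (List_rows_present_in_a_table, Subject) are assumptions for the sketch.

first(body('List_rows_present_in_a_table')?['value'])?['Subject']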

Ready to create a case now.

Nothing special. It also populates the tag field.

After some testing, to ensure that the case has the necessary fields entered, the flow was run for a thousand records without an issue.

Creating data this way isn’t quick, 20 mins for 1000 records, but it is easy and allows you to bring in reference data with little effort. Superb for PoC environments.

Training your Model (with data)

Once this data was generated, it was time to re-train my model. It ran through successfully this time.

The model is 97% sure that whatever I throw at it, it should be able to match it against the tags. There is a quick test option here too, which allows entry of a sample phrase to check your model.

All ready to publish.

Publishing your Model

Publishing the model allows it to be used within Flow and PowerApps.

Clicking Publish generates a child table of the entity you first chose where the predictions are stored. The documentation states the table will be TC_{model_name} but it created mine with gobbledegook.

The link on the form helpfully allows you to go straight to the entity in the new customisation interface, where you can change the label of the entity.

Also, it is useful to change some of the views, particularly the associated results view. By default it includes name & date, which is pretty useless, so add the tag and the probability.

As this is a child table of Case, it is by default visible in the case form Related navigation item. In the classic customisation interface, you can change the label of this view.

As it is published, users can use flow and the Predict action to predict the tag for a given section of text, useful if you want to do stuff before it reaches an environment.

Now that it is published, you need to allow the model to run. This means it runs every time there is a change to the text field. This is all done via Flow, so will use your flow runs. It stores the result in the new entity.

If a case is created now, it automatically creates the tag secondary record.

Using the tags

As AI builder generates a record for you with its prediction, and the data is in CDS, it is a simple Flow to utilise that. As it creates a record in the AI Tags table, update the corresponding case to change the subject accordingly.

Simple trigger when a record is created. The first action is to find the subject from the tag.

Update the case record with the subject and the tag so the AI can be retrained later.
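As a sketch, the subject lookup can be a CDS List Records on the Subject entity filtered by the predicted tag, followed by the update. The title attribute is the subject’s name field; the action name Get_Subjects and the field name new_tagtext are placeholders for whichever action and field hold the predicted tag on the generated record.

Filter Query: title eq '@{triggerBody()?['new_tagtext']}'
Top Count: 1

Subject for the case update: first(body('Get_Subjects')?['value'])?['subjectid']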

That’s it. Replacing LUIS with a more user-friendly environment is definitely a tick in the box for Microsoft. The AI in PowerApps feels like a simple, user-friendly stepping stone into the AI world for a lot of businesses. Hopefully, businesses will embrace these simple models to leverage tools to shortcut processes, improving employee and customer experiences.

User Admin – Published App

After being asked on LinkedIn to publish both of the apps that I built for the User Security Admin walk-through, I have done so on the Dynamics Communities Power Platform Bank.

Stand-Alone Security / User App

The first one, which if you remember is a stand-alone application detailed here, can be downloaded here:

https://dynamics365society.uk/powerappsbanklist/dynamics-ce-security-user-profile-powerapp/

Embedded Security App

The second app was the one where I converted the original to an embedded form in the User record, detailed here:

https://dynamics365society.uk/powerappsbanklist/dynamics-ce-embedded-security-powerapp/

Both apps require the custom connector to read and update teams and roles, which is included in the package.

Please let me know if it works for you via Twitter or LinkedIn

Thanks goes to Those Dynamics Guys for the great Dynamics Community & the PPB, as well as Joergen Schladot for giving me the kick up the arse to get it done.