Sunday, April 21, 2024

Power Automate license enforcement - looking at it from a Dynamics 365 CE perspective

In mid-May 2023, a warning popped up in the Microsoft Message Center regarding "Non-Compliant Power Automate Flows". Soon afterwards the message disappeared, so it was probably missed by the majority of Dataverse and Dynamics 365 admins.

The message did, however, raise some concerns in the broader Microsoft community (see Will Power Automate enforcement licensing kill your flows? and Upcoming licensing enforcement in Power Automate explained).

To be honest, I did not pay a lot of attention since the message apparently vanished into thin air, and after consultation with Microsoft support they said that the message had been sent prematurely. But then, at the end of October, another warning popped up.


So it seems that Microsoft is finally cracking down on Power Automate flows that use premium connectors but are not associated with a properly licensed user, as well as on flows not directly linked to a Power App. If you built your own model-driven app on top of Dynamics 365 Customer Engagement (CE) which uses Power Automate flows, you will need to associate those flows with your new app.

There is a PowerShell script to identify the flows which are at risk of being turned off across your tenant - see "I have many environments - how can I get the flows that need my attention across tenants" in the Power Automate Licensing FAQ - which uses the Get-AdminFlowAtRiskOfSuspension cmdlet.

The Get-AdminFlowAtRiskOfSuspension cmdlet is part of a separate PowerShell module which you can install using Install-Module -Name Microsoft.PowerApps.Administration.PowerShell. It will scan your environments and list the flows that are at risk of suspension.
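As a rough sketch, a tenant-wide scan might look like the snippet below (this assumes you have tenant admin permissions; the ApiVersion value follows the Power Automate Licensing FAQ example, and the output formatting is illustrative - adapt before use):

```powershell
# Sketch only: scan all environments for flows at risk of suspension.
# Requires the Microsoft.PowerApps.Administration.PowerShell module and
# an interactive tenant admin sign-in.
Install-Module -Name Microsoft.PowerApps.Administration.PowerShell -Scope CurrentUser
Add-PowerAppsAccount

# Loop over every environment and collect the flows flagged at risk
$atRiskFlows = Get-AdminPowerAppEnvironment | ForEach-Object {
    Get-AdminFlowAtRiskOfSuspension -EnvironmentName $_.EnvironmentName -ApiVersion '2016-11-01'
}
$atRiskFlows | Format-Table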


Check out Associate flows with apps - Power Automate | Microsoft Learn for how to link up a flow with an app (see the screenshot below for where to do this in the Power Automate flow detail screen). If you make this change on a flow which is part of a solution, the associations become part of the solution file and can be transported across environments.


Related articles/blog posts:

The ABC of AI: Retrieval-Augmented-Generation (RAG) and grounding

This is the first in a series of blog posts about more advanced generative AI and Large Language Model (LLM) concepts, which I use as notes to myself (check out Why I blog and you might want to consider it as well).

Retrieval-augmented generation (RAG) is an AI framework that enhances the quality of responses generated by large language models (LLMs). LLMs are trained on massive amounts of data and learn statistical relationships between words, but lack true comprehension of their meaning. So when an LLM is faced with specific questions in a dynamic context, it can fall short - and that is where RAG comes in.

RAG integrates information retrieval into LLM answers by using these steps:

  1. User inputs a prompt: when you ask a question, RAG takes your input prompt
  2. RAG retrieves relevant information from an external knowledge base based on the user prompt
  3. RAG combines this external content with your original prompt, creating a richer input for the LLM
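The three steps above can be sketched in a few lines of Python. This is a toy example - the word-overlap retrieval, the in-memory knowledge base and the prompt template are illustrative placeholders; real RAG systems typically use embeddings and a vector database for step 2:

```python
# Minimal RAG sketch: retrieve the most relevant snippet from a tiny
# in-memory knowledge base and combine it with the user's prompt before
# sending the augmented text to an LLM.

def retrieve(prompt: str, knowledge_base: list[str]) -> str:
    """Step 2: pick the snippet sharing the most words with the prompt."""
    prompt_words = set(prompt.lower().split())
    return max(knowledge_base,
               key=lambda doc: len(prompt_words & set(doc.lower().split())))

def build_augmented_prompt(prompt: str, knowledge_base: list[str]) -> str:
    """Steps 1-3: combine the user prompt with the retrieved context."""
    context = retrieve(prompt, knowledge_base)          # step 2
    return f"Context: {context}\n\nQuestion: {prompt}"  # step 3

kb = [
    "Power Automate flows using premium connectors require a premium license.",
    "Dataverse stores Dynamics 365 Customer Engagement data.",
]
augmented = build_augmented_prompt("Which flows need a premium license?", kb)
print(augmented)
```

The LLM then answers from the richer, grounded input instead of relying only on its training data.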

RAG and grounding are related concepts in the context of enhancing LLMs. Grounding is the process of providing LLMs with information about a specific use case. RAG is one of the techniques used for grounding (another technique is dense retrieval - see Dense X Retrieval: What retrieval granularity should we use for more details).

Microsoft Copilot offers a good example of grounding and RAG in use. The reason Copilot is able to give more targeted responses is that it uses grounding to improve the specificity of the prompt.

Copilot uses Microsoft Graph, which can retrieve information about relationships between users, activities and organizational data (like info in Power Platform/Dataverse and/or Dynamics 365, and info from e-mails, chats, documents and meetings) as part of the prompt grounding process. Microsoft Copilot takes the user prompt plus the additional info retrieved through Microsoft Graph and then sends it to the LLM. For more details see Microsoft Copilot for Microsoft 365 overview.


The most common systems to provide external data for RAG LLMs are vector databases and feature stores.

References:

Monday, April 15, 2024

Dynamics 365 and Power Platform monthly reading list April 2024

Copilots, AI and machine learning

 

Technical topics (Configuration, customization and extensibility)


Topics for Dynamics 365 Business Applications Platform consultants, project managers and power users


Friday, February 23, 2024

SQL Server Integration Services Project template available for Visual Studio 2022

Since the end of 2022, there is also a SQL Server Integration Services Project template available for Visual Studio 2022. You can install it from the direct download link here, or you can search for it in the Visual Studio 2022 extension manager and install it from there.



Thursday, February 22, 2024

Classic Azure Application Insights deprecated on February 29th 2024 - 7 days to go

If you missed it: classic Azure Application Insights will be deprecated on February 29th 2024. If you overlooked the various notification e-mails, you can quite easily see the warning when you navigate to an Azure Application Insights resource in the Azure Portal.


Migration is actually quite easy: you just click the link provided, which opens the menu depicted below and allows you to associate your Azure Application Insights resource with a Log Analytics workspace. The good news is that there are no pricing changes when moving to the workspace-based model.




As indicated in the migration window, this is a one-way operation, so plan for it in advance - the points below might impact how you do the migration:

  • You can link different Application Insights resources to a single Log Analytics workspace, or keep them separate - in most cases you will want to consolidate.
  • Instrumentation keys do not change during the migration, so you don't need to worry about those.
  • The export feature is not available on workspace-based Application Insights resources - look at diagnostic settings for exporting telemetry.
  • There might be some schema changes - important to consider when writing KQL queries - check out query data across Log Analytics workspaces, applications and resources in Azure Monitor.
  • Existing log data will not immediately move to the Log Analytics workspace - only new logs generated after the migration will be stored in the new location.


Tuesday, December 26, 2023

Running SSIS packages in Azure Data Factory - scaling and monitoring

Lifting and shifting SSIS packages to Azure Data Factory (ADF) can provide several benefits. By moving your on-premises SSIS workloads to Azure, you can reduce operational costs and the burden of managing infrastructure that you have when you run SSIS on-premises or on Azure virtual machines. 

You can also improve availability with the ability to specify multiple nodes per cluster, as well as through the high availability features of Azure and of Azure SQL Database. You can also increase scalability with the ability to specify multiple cores per node (scale up) and multiple nodes per cluster (scale out) - see Lift and shift SQL Server Integration Services workloads to the cloud.

To lift and shift SSIS packages to ADF, you can use the Azure-SSIS Integration Runtime (IR). The Azure-SSIS IR is a cluster of virtual machines for executing SSIS packages. You can define the number of cores and the compute capacity during the initial configuration (see Lift and shift SSIS packages using Azure Data Factory on SQLHack).

Even though there is a Microsoft article which explains how to Configure the Azure-SSIS integration runtime for high performance, there is not a lot of guidance on how to run it at the lowest possible cost while still completing your jobs. So would you recommend a higher sizing on a single node, or a lower sizing on multiple nodes? Based on experience, it seems perfectly possible to run most jobs on a single node, and up until now we have been running all of them on a D4_v3 (4 cores, 16 GB) Standard. If you decide to run on a lower configuration, I would recommend monitoring failures, capacity usage and throughput (see Monitor integration runtime in Azure Data Factory for more details).



Reference:


Wednesday, November 29, 2023

Dynamics 365 and Power Platform monthly reading list November 2023

 2023 Release Wave 2

Technical topics (Configuration, customization and extensibility)

Copilots, AI and machine learning

Topics for Dynamics 365 Business Applications Platform consultants, project managers and power users