Power Platform | Maker Assessment App

In my previous article I talked about risk mitigation and introduced the Maker Assessment app that is part of the CoE Starter Kit, publicly available through GitHub. This application can be fully customized and consists of a user part and an admin part, both of which are canvas apps. The solution package can also be installed with Dataverse for Teams, so every Maker with an assigned Microsoft 365 license that includes Power Apps should be able to use it – depending, of course, on what kind of customizations you might add to it.

Maker Assessment App – Welcome Page

Today, I´d like to share more around the user experience of both apps, but before going into this, I´d like to point out what risk mitigation shouldn´t be – a blocking, overengineered process of ten pages or more that shuts down any innovation business users are looking for. Let me use an example that I share quite often during my talks & workshops.

What would have happened if Leonardo da Vinci or The Beatles had needed to deal with risk mitigation that was implemented the wrong way?

A challenger

Leonardo da Vinci – Ball Bearing 1498 – 1500

Let´s assume he had been forced not to use a canvas, paint, or pen & paper. Would we still have seen his great inventions that remain in use today, like the ball bearing? Imagine mechanics without ball bearings – they can be found in nearly any machine with rotary motion.

The Beatles – chord variations

Another – completely different – example I use is about the chord variations Sir Paul McCartney and John Lennon of The Beatles used in their songwriting partnership.

What would have happened if someone had forced them to avoid any variation and instead go with the well-known chord progressions of that period? Would they have sounded special at all, and created one of the most iconic chords ever strummed?

A Musician

When implementing risk mitigation for citizen development inside your company and re-using apps like the Maker Assessment app, keep the processes behind it simple. Ensure Makers can onboard easily, feel empowered, and feel part of a community. Ensure sponsorship of innovative ideas. Create a place of imagination. Below you will find the questionnaire that walks a Maker through 5 easy steps.

Maker Assessment App

Depending on what the Maker selects or provides as an answer, an info panel on the right-hand side offers immediate tips. You may ask how to adopt this concept and make it work for your company. The questions can be customized using the Maker Assessment Admin App.

Maker Assessment Admin App

This app allows your Center of Excellence team to adjust the questions and answers and to add additional information, such as important things to keep in mind or governance considerations.

Maker Assessment Admin App – Answer details

As you can see from the above visual, the Admin app allows you to select any of the 5 out-of-the-box categories and add new questions or modify existing ones. Inside the Answers section you can click the + to add new answers or the pen icon to edit an existing answer. This opens the dialog above, where you can use toggle controls to indicate important constraints and provide additional governance information. A click on the Save button saves this in Dataverse or Dataverse for Teams. But what if your governance process has different or additional categories?

Maker Assessment Admin App – Edit Categories

A click on the menu icon comes to the rescue. It lets you switch to a screen where you can edit categories, rename them, re-sort them, or even create a new one. In my day-to-day use, I found it quite simple to customize the Maker Assessment app that every Maker uses.

Maker Assessment App – Questionnaire

As outlined, Makers are guided through each category and provide their answers. What I found difficult is that the user has to click on each category to move forward or back between categories – though a color indicator helps show where you are. At first, I assumed I had to click Submit to move on, but this button should be used only once – when you´ve finished.

Maker Assessment App – Results

The Submit button navigates you to a results page showing the assessment results. From here you can click Start Over to begin again, Contact the Center of Excellence Team, which provides a small interface to generate an email, or See details for an even more granular report on your assessment.

Maker Assessment App – Result Details

The above visual shows this report. What I would find valuable is a print option – at least for a printout as a PDF file. But as said, that is something you can easily customize and add to this solution. The Start building button from the previous visual takes you into the default environment, where you can start building your app. In practice, you might want to trigger an approval from here when the assessment results in something that needs one – like a special environment setup or a DLP policy.

But, that´s up to you and your individual customization. The component is offering you a good starting point for this. Until then, …

Power Platform – Risk Mitigation of Citizen Development

I´ve been part of many governance, security and works council briefings regarding the Power Platform components. We´ve also seen analysts from Gartner talking about „How to Define and Guide Citizen Development Practices“. While each of its components is worth considering separately, I´ve always recommended following the platform approach and talking about platform enablement instead of enablement of an individual technology or product.
The reason is the deep relationship between the Power Platform services – sometimes even dependencies that would cause trouble if you went with a product-by-product approach instead. Not to forget the deep relationship with Microsoft 365 these days, where several features can already be used with an assigned Microsoft 365 license.

Challenge: Name all the Power Platform capabilities without using deeply technical phrases.

When it comes to preparing briefings – for instance an internal works council briefing – I encourage folks to first establish a common understanding of each included product´s main purpose and the challenges that might occur when using any of those services. The visual below summarizes recent work around this.

Power Platform challenges & main purposes

But that´s of course not all. Everyone should fully understand why Governance, Security, Adoption, Training and Nurturing are important. A good starting point is offered inside Microsoft´s documentation. One way to look at this is to ask yourself why citizen developers are becoming a new generation of risk managers in your company.

How do you educate and support them without blocking their energy, enthusiasm, and ideas around building something for better productivity or digital transformation? In the following visual, I tried to outline why there´s a new generation of risk managers in terms of using Power Apps – and of course, you could run this exercise for every single product inside Power Platform.

Power Apps – citizen developers becoming risk manager

As a result, you will hopefully agree that this is nothing special to Power Platform. Instead, it has been there with any „low-code“ development – using Microsoft Excel and VBA macros, for instance, comes with the same constraints and rules. While following a low-code platform approach, though, we want a simplified and secured process that allows every one of us to start developing as soon as an idea is born.

Asking yourself: How to?

Coe Starter Kit – Maker Assessment App Process flow

The above visual outlines a process that could be implemented based on the Center of Excellence Starter Kit Nurture components, specifically the Maker Assessment. The process shows a simple approach to ensuring that your company´s compliance rules are not violated when starting a new low-code project. A CoE team works in the background, supporting each citizen developer in becoming familiar with standards, rules, and regulations, while ensuring there´s no blocker or over-complicated approval process in the way of starting to work on an idea.

In the past, I´ve seen awesome customizations around this component that I am not able to talk about or share in more detail. But the fantastic part of all of those was seeing companies use the Power Platform capabilities to help automate and simplify the process, by (for example):

  • Setting up internal community landing pages
  • Customizing App Assessment Questionnaire to fulfill internal needs
  • Sharing and providing guidance on compliance standards and regulations
  • Ensuring agile development, while backend processes ensure company standards are fulfilled

Maker Assessment – Data questionnaire

As an example, in the above visual you can see a current questionnaire around the data used in a project idea you might come up with. The guidance on the right can be fully customized, as can every question, equipping each citizen developer to meet compliance standards or to find support from the resources that need to be involved – your Data Security Officer, for instance, or the works council, depending on the project you build with the low-code platform.

The following visual shows the simplified report each assessment results in. With the capability to customize this, including further processes, you can imagine how this enables each citizen developer to handle risk mitigation and stay compliant with company standards, without feeling that a painful process is in place that would stop them from following up on their original idea.

Maker Assessment – Result Details

If you haven´t seen the Center of Excellence Starter Kit in total, make sure to check out the video series. Keep in mind that its purpose is a quick start – customization is expected. I´d also like to encourage you to provide your feedback using the GitHub repository. And if you´re new to the Power Platform and wondering about an adoption framework, please read about the maturity model provided by the Power Platform CAT (Customer Advisory Team) here. You´ll find tips and tricks that are the result of conversations with customers like you.

Until then, …

Power Platform | Bundle as key ingredient

Bundles have become key ingredients in large enterprises´ digital transformation journeys. Analysts from both Gartner and Forrester have been talking about orchestration technology to power cloud initiatives. And you may have seen bundles in the Power Platform licensing guide, where for instance Power Automate flows can be used in the context of an app with a given Power Apps license – just to name an example.

When talking about business value assessment and the key differentiation of Power Platform vs. other vendors, most of the time you can still surprise people with a statement like

A key differentiator of Power Apps as a low-code application creation tool is – it is part of a platform approach!

unknown developer

In the visual below, I try to explain what is meant by this and why it lowers the total maintenance effort and increases efficiency in digital transformation.

Tool orchestration regarding Dev Tools and App modernization

When you make a decision on low-code tools these days, you should reconsider your decision steps: first, whether your DevOps strategy is scattered or more centralized; second, whether your transformation looks for quick wins or a long-term, more agile story. This translates into instant savings or new business empowerment in terms of business outcomes.

Taking a look at the left side, you might recognize a couple of developer tools typically used in your company over the course of time – decisions made in the past. Today, you therefore find yourself in an era of multiple dev tools being used, with questions coming up like „How am I going to integrate these services?“ When it comes to orchestrated dev tools as a platform, you might find your current vendor strategy lacking. This is where your maintenance effort increases and you start looking for simplification.

Forrester and Gartner called this the orchestration of tools as a platform to lower the maintenance effort, increase efficiency and allow for more agile digital transformation.

Taking a look at the right, you see how Power Platform plus additional services could help, and what orchestrated tools as a platform could look like driving your cloud initiative(s). What holds true on the app modernization side, with pro-developer tools and low-code tools coming together, also holds true on the automation side. See the following.

Tool orchestration regarding Hyperautomation

Here you can see what transforming your current process automation strategy into a hyperautomation strategy could look like. The term was coined by Gartner in October 2019 in its top 10 strategic technology trends for 2020. In the above visual you find examples of individual or multiple process automation tools, and on the right you see the benefits of an orchestrated platform with Microsoft´s Power Platform – again, with those services easily combined with additional services. So another challenger statement for a comparison of Power Automate vs. other automation software could be:

„A key differentiator of Power Automate? It is part of a platform approach.“

a process advisor

Tool orchestration regarding Business Intelligence

I´ve recently been part of an enterprise customer´s decision on a cloud initiative, driven by data they had already collected over the years. As you can imagine, what works in the visuals provided earlier for automation and app modernization also works in combination with business intelligence. Depending on your digital transformation journey, there are a couple of benefits you will get from an orchestrated platform plus easily used additional services, as outlined in the top right corner (again, just examples here).

Could you imagine that decisions based on data can be drastically simplified with above approach?

You may ask yourself – what happens in an orchestrated platform approach? Well, before we get there, let me finish by adding the fourth pillar inside Power Platform, which is the intelligent assistant. Have you heard about omni-channel? Did you think of it as a good way of increasing customer loyalty and satisfaction? Did you struggle in the past due to the tools being used – decisions that required them to be integrated, so that integration became the challenge in your current project?

Microsoft´s Power Platform and Power Virtual Agents might be an answer to it, combined with additional services from the Azure or Dynamics 365 world, as outlined here.

Tool orchestration regarding intelligent Assistance

Okay, I guess with all the visuals provided so far, you already understand how tool orchestration in a cloud initiative could be implemented successfully. You´ve seen examples drawn from:

  • App modernization
  • Process Automation
  • Data Analytics or Business intelligence
  • Intelligent assistance

Nevertheless, you´re not yet fully convinced this could all work inside your company to lower maintenance effort, increase efficiency, and digitally transform? And that it not only drives low-code or citizen development, but is for all developers?

Tool orchestration regarding data sources

So let´s add the last mile of benefits, which is again in regard to data sources. You should feel quite familiar with the visual above, showing on the left the number of data sources that may be used inside your company, growing over time – each of them adding a challenge to your future data strategy. Need examples?

Well, what about a GDPR „please forget“ request? Would you be able to process this request within a 72-hour timeframe? What about a master data management approach? Did you try this in the past, but integration remained a challenge and big data never reached a mature level? Could you easily access business application data, like from a ticketing system, inside your intelligent assistant composer?

Above you see Microsoft Dataverse in an orchestrated data platform – which by no means suggests getting rid of all the task-fitting data sources and building one big (fits-all-and-everything) database. Consider it as helping you orchestrate through integration capabilities such as virtual entities, a common data model, and developer extensions. Want to know more, because you thought Microsoft Dataverse was just another database? Please follow this. In a previous article I also outlined a comparison between Microsoft Dataverse and Azure SQL.

In the end, all of these visuals provide conversation starters to talk about „orchestration as a platform“ and set the foundation for your business operations ecosystem. Questions? Please leave a comment, or use Twitter to comment or reflect on this.

Until then,…

Power Platform | Dataverse or Azure SQL or both?

„We are the creative force of our life, and through our own decisions rather than our conditions, if we carefully learn to do certain things, we can accomplish those goals.“

– Stephen Covey

At the beginning of this month, I shared an article around a typical upcoming question regarding automation tools. Today, I’d like to share some insights about another question that is pretty common before starting to create apps with low-code tools like Power Apps, given the need for a robust and scalable database to store all the collected app data – use Azure SQL or Microsoft Dataverse, or both?

Which tool for the job? Azure SQL or Microsoft Dataverse

Let´s get started with the above visual outlining the main purpose and intention of both. No surprise – same as last time, you mainly need to drive a decision around IaaS, PaaS, or SaaS. Do you agree or disagree with the short summary?

What I´d like to ensure when a question like this comes up is that we all share the same definition of what we´re talking about. Therefore, we should clarify what kind of Azure SQL we mean. Too many times we call out Azure SQL with something specific in mind, when in fact there are multiple options on both the IaaS and PaaS side. Let´s take a look.

A unified SQL portfolio

From the previous article you already know the level of support and management you would be responsible for when comparing IaaS and PaaS offers with SaaS. When someone in a Design Thinking workshop says:

Hey, I’d prefer to go with Azure SQL for creating new apps, or beginning app modernization…

I wouldn’t say no, but I would challenge the impact of such a decision made at an early stage and ask whether the details are already known. For those new to Azure SQL, Microsoft offers a course via Channel 9 that covers Azure SQL for Beginners in six compact modules with great intro and demonstration videos. I recommend taking a look, even if you´ve been working with Azure SQL for a longer time – a learning refresh can´t hurt. I won´t cover all of the content of this course today; instead I’ll focus on principles you will find inside Microsoft Dataverse as well. Those are:

  • Setup, configuration and deployment
  • Role-based access control
  • Managing and monitoring security
  • Auditing
  • Data Encryption
  • Dynamic Data Masking
  • Maintenance
  • Backup and restore
Setup, configuration and deployment

My first question would be about the number of tasks required for setup, configuration, and deployment in Azure SQL compared with Microsoft Dataverse – a question of IaaS, PaaS, and SaaS specifics. When performing a business value assessment this is pretty important, even if the effort can be reduced. But many times I see these facts missed in a business value assessment, or simply ignored because the tasks were already completed for a different project the Azure SQL infrastructure was needed for.

A second question would be around security and role-based access control to granularly shape access to data.

Azure – RBAC

Of course, the amount of data collected will grow over time; instead of a single app, multiple apps may require access to the same data, such as shared contact information. Visualizing data with Power BI or performing data analysis might require additional user access in the future.

Monitor and manage security

Therefore, continuous monitoring and management of security is recommended. The above visual outlines some typical tasks you would find in a checklist when preparing your Azure SQL governance concept. Next would be a question around auditing. Depending on the kind of applications you´re going to create, this feature can come in handy – assuming multiple users are able to edit a single record.

Auditing in Azure SQL

The above visual again outlines some checkpoint items for your governance concept that you would set up. If you´re interested in all the details, with demos – again, please make use of the Channel 9 course above. Our next topic is typical when talking about data in the cloud – data encryption. Below you can find some valuable insights around this topic.

Data Encryption

Next, many app scenarios contain sensitive data that, depending on rules, some users are granted access to while others shouldn´t be able to access or visualize. Azure SQL offers Dynamic Data Masking for this case. So a question from me would be around the use cases that might lead to this specific situation and how to deal with it. The visual below shows the roles of a normal end user and a data officer; you can see the differences in the Social Security Number and email address columns.

Dynamic Data Masking
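To make the idea tangible, here is a small Python sketch that mimics the masking behavior – a simplified stand-in for what Azure SQL´s built-in masking functions do server-side. The sample data, column names, and the partial SSN mask setup are my own illustrative assumptions:

```python
def mask_email(value: str) -> str:
    # Roughly mirrors the idea of Azure SQL's email() masking function:
    # expose the first letter, replace the rest with a constant pattern.
    return (value[:1] + "XXX@XXXX.com") if value else value

def mask_ssn(value: str) -> str:
    # A partial mask that exposes only the last four digits.
    return "XXX-XX-" + value[-4:]

def read_row(row: dict, is_privileged: bool) -> dict:
    # A privileged reader (e.g. the data officer) sees raw data;
    # a normal end user sees masked values - the core idea of
    # Dynamic Data Masking, enforced here in application code only.
    if is_privileged:
        return dict(row)
    return {
        "name": row["name"],
        "email": mask_email(row["email"]),
        "ssn": mask_ssn(row["ssn"]),
    }

row = {"name": "Jane Doe", "email": "jane.doe@contoso.com", "ssn": "123-45-6789"}
print(read_row(row, is_privileged=False))
print(read_row(row, is_privileged=True))
```

The important difference in the real feature: the database applies the mask at query time based on the caller´s permissions, so the application never sees the raw value.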

Almost at the end, we find two additional topics. First, a question regarding maintenance for performance – again, something that should be carefully considered when performing a business value assessment. The visual below highlights the two main ongoing streams. Depending on scale and the Azure SQL infrastructure, there may be more to consider in an enterprise organization. Don´t underestimate this effort, as slow performance when accessing app data can lead to a bad user experience that ends in people not using the app or finding workarounds.

Maintenance for performance

Last but not least, for a disaster recovery concept there´s our final question around backup and restore on Azure SQL. Again, effort that should be recognized in business value assessments comparing IaaS, PaaS, and SaaS. Some insights can be found in the following visual.

Backup and restore

You may now ask why I outlined all these principles around Azure SQL. This is to provide guidance when comparing the options of Microsoft Dataverse (SaaS) and Azure SQL (IaaS, PaaS). You can find many of the principles listed above inside Microsoft Dataverse as well – though the feature names may differ. Take a look at the following visual and focus on Environment Lifecycle with regard to deployment. Move on to Security + Compliance for RBAC, auditing, data encryption, and dynamic data masking (field-level security). Then return to Environment Lifecycle with regard to Backup, Copy, and Reset for disaster recovery. Are you surprised to see the similarities?

Let’s take a short deep dive and look behind the scenes – what’s behind Dataverse? Dataverse is a SaaS offer „backed“ by Azure and can be categorized into compute & storage plus events & extensibility. No surprise: inside the storage you find Azure SQL Elastic Pools. So is Microsoft Dataverse the low-code canvas of Azure SQL? Well, Dataverse is far more than just a database; it is a service of its own. Dataverse stores table data in Azure SQL, but it also stores table data in Azure Storage, Cosmos DB, Azure Data Lake, and Cognitive Search – all in an intelligent way, without you needing to make a decision. All of this is exposed via T-SQL and OData/REST APIs.

Dataverse – what´s behind?
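As a small illustration of that OData surface, the following Python sketch builds a Dataverse Web API query URL the way a client would before sending it with a bearer token. The environment URL, table, and filter are hypothetical examples; the `/api/data/v9.2/` path is the commonly documented Web API endpoint:

```python
from urllib.parse import quote

def dataverse_query_url(env_url: str, table: str, select: list, flt: str = None) -> str:
    """Compose a Dataverse Web API (OData) query URL.

    env_url: environment URL, e.g. "https://contoso.crm.dynamics.com" (hypothetical)
    table:   plural entity set name, e.g. "accounts"
    select:  column names for $select
    flt:     optional OData $filter expression
    """
    query = "$select=" + ",".join(select)
    if flt:
        # Percent-encode the filter expression (spaces become %20),
        # keeping OData punctuation such as parentheses and quotes intact.
        query += "&$filter=" + quote(flt, safe="()'")
    return f"{env_url}/api/data/v9.2/{table}?{query}"

url = dataverse_query_url(
    "https://contoso.crm.dynamics.com",  # hypothetical environment
    "accounts",
    ["name", "accountid"],
    "statecode eq 0",
)
print(url)
```

In a real client, this URL would be sent with an `Authorization: Bearer …` header obtained via Azure AD; the sketch only shows the query-composition part.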

In addition, and as you would expect from a SaaS offer, it scales with demand without you needing to take care of the infrastructure behind it. Need more power for some jobs or reports? Dataverse will handle this for you. Need more storage? Microsoft Dataverse can handle this for you as well. It may require investing in a capacity add-on from a licensing perspective, but no investment in additional infrastructure. Asking yourself where you can learn more about Microsoft Dataverse? A learning path can be found via MS Learn. And don´t forget about events & extensibility, which allow you to build the specific integration scenarios your apps might need. Additionally, note that based on customer feedback, Microsoft recently extended Dataverse for Teams capabilities to ensure your quick start.

Does all this mean Microsoft Dataverse should be favored over Azure SQL? Of course not – there are reasons for using Azure SQL and there are reasons for using Microsoft Dataverse. In fact, I´ve seen many enterprise organizations using both: plant control towers in the manufacturing industry that rely heavily on Azure IoT services and Azure Digital Twins, with data that needs to be visualized and analyzed and that triggers actions – for instance, an onsite repair service started automatically based on thresholds.

Building modern apps

The above visual outlines the power of Azure SQL for building modern apps, in both cloud and on-premises scenarios. Adding Microsoft Dataverse simply evolves this story and offers another option for modernizing your app infrastructure, while allowing easy extension with the low-code tools offered via Power Platform. The final decision is up to you.

Until then,…

Power Platform | Microsoft Teams & Approvals

With all the news around Microsoft Teams during Microsoft Ignite and before, you might have come across the following challenge: when uploading a file to a specific Teams channel, you tried to select a Power Automate flow that you or someone else created and shared with you via „…“, and the resulting UI was something like this:

Microsoft Teams – Files – Context Menu options

So you asked yourself: where is my Power Automate section within that context menu? If you click „Open in SharePoint“, it brings you to a screen like this:

SharePoint – Selected File – Context Menu options

Here you can find Automate inside your context menu, along with many more options. There´s obviously a design gap between the SharePoint and the Microsoft Teams layouts for presenting files.

Solution 1:
Assuming you´ve implemented an approval request for a selected file, your way of „triggering“ this event would be a 4-step action: select the file within Microsoft Teams, select Open in SharePoint from the context menu, select the file again in the SharePoint UI, and select the approval request flow from the Automate context menu option.

Note that this solution is only possible when using the Microsoft Teams web or desktop experience. From the mobile client, the options inside your context menu from within Microsoft Teams mobile app are different.

As you can see, this solution comes with a switch between two applications, and in many cases you might not want to switch between clients. So what´s the next best thing?

Microsoft Teams – Files tab with a folder in it

One thing you could consider is to use the Move context menu option. In my case, I´ve created a folder called Init Approval.

Solution 2:
Whenever a user moves a file to this Init Approval folder, the request approval flow is initiated for that file. It starts the approval process and ensures the user knows about the progress. How? Let´s take a look at the details.

SharePoint – Preparing a custom column

First, I created a custom column that holds my approval status. Whenever a user uploads a file to the file repository within Microsoft Teams, this column shows „Not initiated“ by default, so the user can recognize the current status.

Microsoft Teams – Approval pending

Whenever a user moves a file to the Init Approval folder, the approval process kicks in, changes the status to Pending, and moves the file back to the original main folder. That way the Init Approval folder stays clean, not containing any documents unless someone created a file directly inside it. The user doesn´t need to jump into this folder to get the approval status of the moved document; they can look at the column next to their document inside the main folder.
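The status handling just described can be summarized as a tiny state machine. Below is a minimal Python sketch of that logic only – the event names and the „Approved“/„Rejected“ end states are my own naming for illustration, not part of the actual flow:

```python
DEFAULT_STATUS = "Not initiated"  # default value of the custom status column

def next_status(event: str, current):
    """Status transitions mirroring the approval flow logic.

    Events: 'upload'                  -> file created in the main folder
            'moved_to_init_approval'  -> user drops the file into Init Approval
            'approved' / 'rejected'   -> approver decision (hypothetical end states)
    Unknown event/state combinations leave the status unchanged.
    """
    transitions = {
        ("upload", None): DEFAULT_STATUS,
        ("moved_to_init_approval", DEFAULT_STATUS): "Pending",
        ("approved", "Pending"): "Approved",
        ("rejected", "Pending"): "Rejected",
    }
    return transitions.get((event, current), current)

# A file is uploaded, moved into the folder, then approved:
s = next_status("upload", None)               # "Not initiated"
s = next_status("moved_to_init_approval", s)  # "Pending"
s = next_status("approved", s)                # "Approved"
print(s)
```

Writing it down like this also makes the edge case from later in the article visible: a file created directly inside the Init Approval folder never saw the 'upload' event, so it simply keeps the default status.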

The beauty of this solution is that it doesn´t require any additional licensing, as all these actions can be done with the seeded Power Automate licensing inside Office 365 / Microsoft 365.

The downside of this solution is that it requires a folder to be „mapped“ to the approval flow. With a couple of condition clauses, I could certainly use one folder to initiate different kinds of flows, but it would be tough for users to understand which kind of flow kicks in at which time, unless you train them.

Another downside is that, again, it only works inside the Microsoft Teams desktop or web experience. Moving files from the Microsoft Teams mobile app – as of writing this article – requires users to use another app for this action, such as the OneDrive for Business or SharePoint mobile app.

You might ask what the trick behind the trigger is, as a move file trigger can´t be found inside the SharePoint connector. Using the create file trigger inside the SharePoint connector won´t make the flow run, as we haven´t created a file – we moved a file.

Microsoft Power Automate – Approval request flow logic

What I did was use a Recurrence trigger, followed by the Get items SharePoint action with a Filter Query as listed in the visual above, to react only to those documents. The rest of the approval flow logic is up to you. [I will not outline the various options for creating approval flows here, as you can find many examples via a simple search.] I would recommend starting with a template like this and modifying it.
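The exact Filter Query lives in the visual above. As an illustration of the shape such a query takes, the following Python sketch composes an OData filter like the one the Get items action accepts – the internal column names (`FileDirRef`, `MyApprovalStatus`) and the folder path are assumptions for this example; check your own list´s internal names:

```python
def build_filter_query(folder_path: str, status: str) -> str:
    """Compose an OData Filter Query string for the SharePoint 'Get items' action.

    FileDirRef and MyApprovalStatus are hypothetical internal column names;
    your list's actual internal names may differ.
    """
    def esc(s: str) -> str:
        # OData escapes single quotes inside string literals by doubling them
        return s.replace("'", "''")

    return (
        f"FileDirRef eq '{esc(folder_path)}' "
        f"and MyApprovalStatus eq '{esc(status)}'"
    )

print(build_filter_query("/sites/TeamSite/Shared Documents/Init Approval",
                         "Not initiated"))
```

In the flow designer, this string would go straight into the Filter Query field of the Get items action, so the flow only processes documents sitting in the mapped folder with the default status.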

To be fair, another downside of this solution is certainly the recurrence pattern, which consumes API calls throughout the day as it continuously looks for „jobs“, even if no file was moved into the folder. We could make the flow terminate if no document matches the Filter Query statement, but it will still consume API calls.

Additionally, a couple of other cross-checks might be considered. What about documents created directly inside my Init Approval folder? With the current logic, their „My Approval Status“ would be „Not initiated“ as well, as I selected this as my default value. But that was out of scope for today´s article, as I only wanted to outline the out-of-the-box capabilities without jumping into other licensing constructs. But wait, there´s more.

Solution 3:
Take a look at the announced improvements to the Approvals application inside Microsoft Teams. You can read about them here, and with the upcoming attachment capabilities, there could be an even more attractive way for your users to initiate approval flows for uploaded documents.

Until then, …

Power Platform | Power Automate or Logic Apps or both?

If you obsess over whether you are making the right decision, you are basically assuming that the universe will reward you for one thing and punish you for another.

– Deepak Chopra

Sometimes, making a decision inside your project isn´t an easy task. When it comes to choosing between low-code and pro-code tools, that remains true as well. We´ve all been in those situations – using a screwdriver to drive a nail into a concrete wall? Well, an easy call, but what if you have at least two hammers looking almost the same – which one to use? Sledgehammer or fitter´s hammer? In simpler words – which tool for the job?

Which tool for the job? IaaS, PaaS and SaaS

The above visual outlines the home of Microsoft Azure Logic Apps and Microsoft Power Automate, which is important to keep in mind in terms of your preferences and preferred way forward. Why?

Just as with the DIY example above, you could ask inside your project: Logic Apps or Power Automate or both? And why is Microsoft making an offer for both?

I guess the last question is easy to answer – providing options and freedom of choice to empower every person and every organization on the planet to achieve more. The first question, though, remains a not-so-easy ask. From the visual earlier, you could think of this as a question of PaaS vs. SaaS. Based on conversations with customers, I´d say multiple factors come into play here.

From an earlier post you might remember me talking about stepping out of the seeded licensing offer. By seeded, I mean using Power Automate as part of your O365/M365 licensing. Logic Apps, for sure, needs to be licensed individually. For that, Microsoft offers a cost estimator which can be used as guidance. In the case of Logic Apps, you´ll be asked to provide estimates on actions, standard and enterprise connectors, optional data retention and, if you consider using Logic Apps as an integration tool, some additional parameters. Almost the same as with Power Automate, when it comes to connectors there´s a classification in place.

Which tool for the job? Comparing Logic Apps and Power Automate

In the past, if you had asked Microsoft which tool to go with, the debate would have started with personas deriving the answer. If we asked pro-developers, their preference could be on the Logic Apps side, due to features like Visual Studio integration or application lifecycle management. Citizen developers might prefer Power Automate instead. Today, you could take a look at information shared, such as this, that aims to help you with a comparison and decision. But what about the myth of Power Automate being way too expensive if individual licensing is needed?

Let´s say there´s a use-case identified that requires a trigger or connector classified as „premium“. In this case, as of writing this article, we´ve got 4 paths to follow:

  • Your flow logic is part of a Power Apps application and you therefore could go with Power Apps licensing
  • Your flow logic is part of an intelligent chatbot and you go with Power Virtual Agents licensing
  • Your flow should be used individually and therefore you go with the „per User“ Power Automate licensing
    (Note: you got two options in here, with attended RPA incl. or without RPA)
  • Your flow is going to be used by a department and you consider a „per flow plan“ licensing instead.

All of the above paths cause different costs while addressing different needs and purposes. An example: while writing this article, the list price for a „per User“ Power Automate license is 12,60€ per user / month or 0,41€ per day [12,60€ × 12 / 365 = 0,41€]. This would allow individual users to create unlimited flows based on their unique needs.
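The bracketed calculation can be reproduced in a few lines; the list price is the one quoted in this article and may of course change over time:

```python
# Break down the "per User" Power Automate list price to a per-day figure.
# 12.60 EUR per month is the list price quoted in this article.

monthly_price_eur = 12.60
price_per_day = monthly_price_eur * 12 / 365  # annualize, then divide by days

print(round(price_per_day, 2))  # 0.41
```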

Running the same definition with Logic Apps, the math is a little different: every time a Logic App definition runs, the trigger, action and connector executions are metered. Prices are therefore given per execution. That adds complexity to a comparison, wouldn´t you agree? We cannot simply compare the price estimation for Logic Apps with the above price for Power Automate licensing. The above pricing would need to be broken down to a per-execution level for a true comparison.

Is there a way for us to do so? Some say yes – let´s take a closer look into the pricing matrix again and find something like „Active flows per user“. For the given license it says „unlimited“, with a reference to usage being subject to service limits. Please review https://aka.ms/platformlimits for more details. That link provides the information of 5.000 API requests / 24 hours.

If it were possible to count the number of API requests within a single flow, you could do the math and find out the maximum number of flow runs possible within the given limits. You would then return to the Logic Apps cost estimation and do the math on the same number of executions per day. And you would have a true comparison between pricing? Well, you got closer.
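Under those assumptions, the math could look like this. The 5.000 requests / 24 hours figure is the service limit referenced above; the number of API requests a single flow run consumes is a hypothetical input you would measure for your own flow:

```python
# Estimate how many flow runs fit into the daily API request limit.
# 5,000 requests / 24 hours is the service limit referenced above;
# requests_per_run is a hypothetical value you would measure yourself.

DAILY_API_LIMIT = 5_000

def max_runs_per_day(requests_per_run: int) -> int:
    return DAILY_API_LIMIT // requests_per_run

print(max_runs_per_day(10))  # 500 runs per day at 10 API requests per run
```

The resulting runs-per-day figure is what you would then feed into the Logic Apps cost estimator for an apples-to-apples comparison.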

The problem with the API limits is that there´s no hard cut-off. So technically you could fire more than 5k requests without running into issues besides throttling.

Could ROI be our friend then? ROI is a popular metric because of its versatility and simplicity.

ROI = (Current Value of Investment - Cost of Investment) / Cost of Investment

Return on Investment (ROI) Definition (investopedia.com)
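As a small worked example of the formula, with purely hypothetical numbers:

```python
# Worked ROI example with hypothetical numbers.

def roi(current_value: float, cost: float) -> float:
    """ROI = (current value of investment - cost of investment) / cost."""
    return (current_value - cost) / cost

print(roi(1_200.0, 1_000.0))  # 0.2, i.e. a 20% return
```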

Sometimes it can be hard to use this formula. Comparing Power Automate with Logic Apps, you would need to define what „unlimited“ creation of flows means to you or your company in terms of the current value of investment. So while Logic Apps is a per-execution decision, the above clearly tells us Power Automate aims to cover more than a single use-case in terms of how it is licensed.

Defining more than one use case, you could of course do a price estimation for Logic Apps as well – don´t get me wrong. And yes, pricing in terms of licensing costs could still look way more attractive than Power Automate until you reach a certain number of executions of your flow definitions.
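To illustrate that last point, a hedged break-even sketch: given a flat monthly license price and a per-execution price for a consumption-based model, you can compute the number of executions at which both cost the same. The per-execution price here is a purely hypothetical figure, not an actual Logic Apps list price:

```python
# Break-even sketch: flat monthly license vs. per-execution pricing.
# monthly_license uses the Power Automate list price quoted in this article;
# price_per_execution is a purely hypothetical consumption-based price.

def break_even_executions(monthly_license: float, price_per_execution: float) -> int:
    """Executions per month at which per-execution pricing matches the flat fee."""
    return round(monthly_license / price_per_execution)

print(break_even_executions(12.60, 0.001))  # 12600 executions per month
```

Below that number of executions the consumption model wins; above it, the flat fee does – which is exactly the trade-off described above.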

As I outlined in today´s title, this article is not about Power Automate for each and everything. You´ve found a price difference between both? Can you now find the arguments for using the fitter´s hammer? Find a couple of useful challenger questions below:

  • Who is your main audience to serve? Professional developers, Citizen developers or both?
  • What’s your preferred model IaaS, PaaS or SaaS or a mixture?
  • Are you talking about a single-use case only or multiple use-cases?
  • Do you prefer the main Governance happening via Azure only, or do you wish to allow a Governance team to set up DLP policies?
  • What could be additional methods for determining value?

Making the business case is a challenge. Until then, …

Power Platform | The environment jeopardy bonus round!

You might have recognized my three-part episode on environment strategy for Power Platform and walked through Part I and Part II already. If not, this is your last chance before the bonus round you don’t want to miss.

Today I’d like to answer the question of how many environments might be good, optimal or worth considering. As we started, I introduced you to this visual.

Environment strategy- default + Microsoft Teams

This could already be your optimum, assuming you’re a small company with a lot of work happening inside Microsoft Teams and some SharePoint usage. But other companies might feel more familiar with a setup like the following.

Environment strategy- dedicated + Microsoft Teams

In this scenario I would assume your company uses SharePoint with customized forms (Power Apps) and some flows triggered by SharePoint lists. Some of your teams or plant workers organized themselves in Microsoft Teams and channels where some apps, flows and intelligent chatbots can be found, and additionally, your citizen developers created some medium to large LoB-like apps, flows or chatbots in a dedicated departmental environment setup. A pretty rock-solid setup and environment strategy. What else?

Environment strategy- shared, dedicated + Microsoft Teams

Well, as I mentioned medium to large LoB apps, chatbots or flows, it might ring a bell in terms of Application Lifecycle Management (ALM). For this case you could have additional shared environments as sandbox or production environments (I recommend not using trials for this); you would develop and test your apps first in those environments with a smaller group of people and finally move them to the production departmental environment.

Hopefully, this answers your question on what should be the bare minimum, the optimum or the best recommendation in terms of the number of environments in your organization. Do you need to start with the last scenario? Of course not, but your environment strategy hopefully allows for such growth without causing any issues.

If not, time for a review and adjustment with what you’ve learned in this bonus round. Hope you enjoyed the episode, let me know in comments or tweets. Until then,…

Power Platform | The environment jeopardy winner!

In our last episode, I brought up the environment strategy again, as I was asked how many of them is too much and what would be the optimum.

So let’s start by digging into the various doc pages that can be found around this and bring up the basics. First, by repeating myself: some admin tasks are no longer bound to licensing. Have a look here:

Additional environments

As you can learn from this visual, it might be a good idea to get control over who is able to create and maintain your environments inside your organization, especially if the last-mentioned admin roles are not held by a single person or team in your organization.

Assuming you read the previous article, you’ve seen me pointing at three buckets of persona profiles you first want to control before creating any type of apps, flows or chatbots. You recognized me mentioning security groups and you found the article on how to add a security group to your environment to control access.

Security group added to environment

As you can see from the above visual, there are some caveats you should know about. But many times I’ve been asked what else happens based on the assignment. And that is an automatic user sync. The user sync can be done manually as well, but let’s first get into the details.

Understand user sync in Dataverse

First, we’re talking about an environment with Microsoft Dataverse provisioned here, not a Microsoft Teams-related environment, where this concept works differently.

Second, some of the users found might be there from previous activity (a deactivated user, for example), some for non-licensing-related reasons (admins) and some as part of different licensing than you thought (Dynamics 365 licensing in the mix).

User sync continued

But there’s more, especially regarding a recently introduced Office 365 E3 capability – Project App – which technically can be hosted in a non-default environment. And furthermore, some just-in-time (JIT) users can be found that started using a shared app, for example, and were granted Microsoft Dataverse permissions.

Enable and disable users

Before looking into permissions, let’s briefly close the chapter by understanding how to enable and disable a user. Something to consider when you create or modify your environment strategy.

Understanding permissions

Now that we understand how, why and when certain users can be found inside the environment, we can finally talk about giving them permissions, which is two-fold.

One side is seen from the three buckets of persona profiles I introduced in my previous article. The second is specific to Microsoft Dataverse-based apps, flows or chatbots, considering how to access data rows, tables, etc.

I hope this provided a more simplified overview of the items and topics which should be part of your environment strategy. Before finally answering the above question, let’s calm down and relax after so much content. Until then, …

Power Platform | The environment jeopardy!

What seems like a provocative title for today’s article actually started with a simple question the other day in a design thinking workshop: how many environments are enough, what is the bare minimum, and what should be the optimum?

Since an environment strategy is something a Power Platform Admin should carefully think about – wait, only a Power Platform Admin? – well, in fact, since the admin task of managing and deploying an environment is no longer bound to licensing, this task or strategy should be a combined thought by the Global, Power Platform and Dynamics 365 Admin roles.

Environment strategy- simplified approach

I guess you feel reminded when looking at the above visual, but since the world continues to offer new opportunities, we do see new capabilities being added since I last reviewed this during a collaboration writing the Admin Whitepaper. So what has changed since then, as many of the above rules still apply?

Environment strategy- default and Microsoft Teams

First of all, the default environment, which can now have Microsoft Dataverse added as well. In addition to Power Apps and Power Automate flows seen from SharePoint, there’s now Power Virtual Agents in place as well. Furthermore, you have the opportunity to add Microsoft Teams environments (up to 500) based on Microsoft Dataverse for Teams (whether you’re using it inside your apps, flows or intelligent chatbots or not). Some of the data of those tools will be stored in Microsoft Dataverse for Teams in the background.

Default environment specials

With that, we should be reminded of some specials that apply to the default environment, which you can find in the visual above. But when considering an environment strategy for your company you might ask for more. What’s behind the Microsoft Dataverse being provisioned or not?

Persona Profiles regarding security

That brings us to the above visualized persona profiles. Because we’re not yet talking about Microsoft Dataverse being used as a data source for your apps, flows or chatbots – we’re focusing first on environment access and Dataverse Maker control. The last one allows Makers to create new tables, modify or add fields, add relationships and much more.

As a best practice, you therefore might like to introduce the concept of using Azure Active Directory security groups for a more simplistic management of all your users being in one of the above ‚buckets‘, plus managing them to have the right licensing in place.

Keep in mind that some of the tasks are independent of licensing – therefore adjust your strategy. But before we dive into more details, for which I’ve recognized you would need multiple doc articles to read through, enjoy your day and see you next time. Until then, …

Power Platform | Rise of the (evil) community developer

An interesting article regarding today´s topic was brought to my attention the other day by Serge Luca (aka #doctorFlow), and obviously a headline like „lots of solutions or lots of problems“ caused me to read through it. I felt reminded of many of my Power Platform-related Governance Design Thinking workshops and sessions and thought, why not share some key learnings with you.

Pinky and The Brain – Created by Tom Ruegger

Those remembering my style of „simplification“ from previous articles – I was brought back to the mid-to-late 90´s and the famous Pinky and the Brain episodes, which I enjoyed watching. I thought they could be a perfect ice-breaker when starting any of my sessions regarding Governance, Security and Compliance concerns, as they could outline what goes on between IT and business folks when it comes to low-code platforms these days. It also adds the aspect of the business being seen as the „evil“ part by IT here. Why?

Power Platform – apps compliance and security concerns

Because of „Shadow-IT“ and many more reasons. Well, let´s take a look at the list of risks in the above article to better understand the „many more“ aspect here:

  1. Integration risk: This risk involves exposing data that shouldn’t be exposed.

I would say this is the #2 most famously raised concern regarding Power Platform, when it comes to the 400+ out-of-the-box connectors offered and the fact that they are seen as „uncontrollable“.

  2. Transformation risk: This risk involves bugs or miscalculations in the app that lead to bad business decisions.

From my experience, #3 to #4 on the list of concerns are Power Platform being not enterprise-ready or not offering the toolset needed to monitor the „low-code“ of a citizen developer, allowing for proper ALM (Application Lifecycle Management) or vulnerability checks. In addition, the article talks about risks introduced by community developers, outlined with an example of them training a machine learning model.

Low-code – a Team sport

I am not saying ignore those risks or that they are non-existent, but I remind my session attendees quite often that low-code is a team sport. Nothing more, nothing less. And the fusion of development teams – pro developers supporting community developers by not only offering them a secured and monitored low-code ecosystem, but in fact offering them supportive pro-code extensions and pre-trained AI or machine learning models – is one way to look at a successful implementation and a low-code strategy for digital transformation.

Just as the article wraps up the story with „managing the risk of low-code platforms“ being key to success, I´d like to point out that Governance, Security and Compliance management is key to an easy digital transformation journey with the Power Platform. I outlined a possible way to kick-start your journey with a visual that can be found inside this article. Additionally, Microsoft offers a pretty good overview to familiarize yourself with the activities and actions needed. Yes, it is a lot to read through when doing it on your own. But there are thousands of consultants and partners out there who are happy to help you with that.

Got questions? Leave a comment or send a Tweet. Until then,…