Power Platform | Prevent stalling

You know what no airplane pilot is keen on? Dealing with a stall, right?! It's no different for software product owners who would love to see their product or project flourish and grow. And just as in airplane operations, there's a secret or hidden force that prevents stalling. A force that, kept in the right balance, keeps your airplane/project on track and flies it safely from A to B and along the routes that follow.

Today I'd love to talk about the secret or hidden force it takes for your Power Platform projects to flourish and keep growing in a balanced way. May I introduce you to the Power Trinity.

Visual showing the Power Trinity – three components for ensuring customers realize value

I've recently seen Microsoft MVPs like Mark Smith start discussions on the Center of Excellence being dead and the momentum shifting toward a Center of Enablement. In addition, Steve Mordue asked readers to agree or disagree with the notion of a mountain of shitty little apps being created by "Citizen Development". You might have noticed those posts as well; if not, it might be a good idea to read them first before continuing here.

With the majority of customers I've talked to on their journey of successfully adopting Power Platform and low-code tools as part of their developer community toolchain, all of those initiatives remain dependent on the Power Trinity: the secret force that keeps Power Platform projects alive and flourishing when kept in balance. For some customers, though, this hidden force goes unrecognized when they take their seat as the airplane pilot. They think a quick round in a flight simulator (comparable to studying what others have done, or copying best practices) will prevent them from stalling, and all of a sudden they find themselves exactly in that situation.

You could also think of a pilot who has been operating an A330 for many years, assuming that this knowledge would let them operate an A380 easily. That's much like IT operating the Power Platform the same way they have operated Microsoft 365 for years, assuming governance works the same and that more governance will let them fly the A380 within minutes.

Digital transformation with low-code tools

Well, we all know change isn't easy, nor is it something you can simply train for in a simulator. What you have learned can't be taken for granted or assumed to stay the same. Is it any different when using low-code tools to drive digital transformation? It takes people, culture and a future-oriented mindset, which can feel risky, uncomfortable or new. When you're about to start your journey, or you find yourself "stalling", you should ask yourself whether you've prepared the flight crew on that airplane the right way – in terms of using the hidden forces that keep your plane successfully in the air. As with flying an airplane, the setup is key to success. Let me dive a little deeper into that.

The "power" house that drives Power Platform adoption success

Did you know that, just as with an airplane, it takes a capable crew to operate it? Power Platform adoption is a multitude of jobs, roles, people and "forces" coming together for success. And the better prepared you are as a crew, the more it flourishes. As a best practice, customers who realize value have built a house with at least six rooms. In one you would find a Governance Lead, in another an Application Lead, and in a third a Skilling Lead. All of them work on parts of the flight plan, meet regularly and, of course, align on the current plan before the next flight.

In another room you might find Adoption Leads or Champions, who of course weren't born into that role. They grew into it and in turn became mentors for others. If you haven't accounted for them in your plan, think of it as missing the ground crew that maintains your airplane. While in the air, you may not need them; but on the ground, you can't get ready for your next endeavor without them.

The fifth room is basically your pre-flight deck, where everything gets reviewed regularly, key learnings are taken from your last trip and carried forward as key performance indicators for the next one – the Exec Steering Group. And last but not least, in the sixth room there's the flight control tower. Without the operator in here, there's no safe take-off or landing. You may operate on autopilot for some landings, but the supervisor is still in the background. The same holds true for your Power Platform journey if it doesn't have Exec Sponsorship.

Visual outlining guidance to accelerate the Power Platform adoption journey

The secret or hidden force can therefore be planned for with a simple framework many Azure customers may already be familiar with, because the steps can also be found in the Cloud Adoption Framework. If you thought you could skip any of these steps, or fast-forward by only copying what others did, you may well see a "stalling effect" sometime soon. You can still avoid running into it by adjusting your plan and restarting the preparation of your people.

The Power Trinity will help you prevent stalling. Only a balanced Power Trinity will let your digital transformation flourish. If the forces are out of balance you'll see a "stalling effect", and you should watch for the signals carefully, like a good pilot who – even on autopilot – keeps track of what's happening during the flight.

Let me know your thoughts, and whether you'd like to read more about the Power Trinity in the future. Until then, …

Power Platform | May the 4th of AI be with you

With all the noise around generative AI and Copilots being added to almost every Microsoft product line, it's no surprise that generative and especially responsible AI has become a top topic in recent conversations. So on May 4th it was no surprise that I ran into yet another conversation about whether AI should support the way people work or be banned from transforming it. We have been using artificial intelligence for quite a while now, sometimes without knowing it, and it's becoming obvious that AI is turning casual; success stories like the rapid growth of the ChatGPT community will become part of pretty much every industry strategy we're aware of.

Visual outlining the transition from people working side by side to people working side by side with AI

I've recently finished reading the book "The Age of AI" by Henry A. Kissinger, Eric Schmidt and Daniel Huttenlocher, and came across this interesting part:

Although AI can draw conclusions, make predictions, and make decisions, it does not possess self-awareness – in other words, the ability to reflect on its role in the world…But inevitably, it will change humans and the environments in which they live.

The Age of AI, Book

This certainly opens up the discussion of what kind of responsibility is needed – for instance from the IT department serving as a service provider and allowing employees to use AI in the context of their work, and furthermore enabling Fusion Teams to use AI services inside their development projects and even publish those services to end users in a secured and governed manner.

When it comes to Star Wars (May 4th), we all know that the good and the dark side of the Force are ironically fighting over who gets to transform the universe. But in the end it's guidance and governance that lead to the saga we all know. In the context of Microsoft AI solutions, that certainly points to a conversation around responsible AI and how the Power Platform inherits these principles. This concept isn't new – I've been talking about it for quite a while, and back in October last year I published an article about it.

Visual outlining Power Platform & Microsoft's responsible AI principles

While Makers (the Luke Skywalkers) certainly have an interest in learning and upskilling, and even though a first trial might be a case of "let me play and fail or succeed with it", there's a high responsibility on IT and security teams (the Master Yodas) to help govern and secure the company's usage of AI services, and obviously to train Makers to use AI responsibly, whether that's Azure OpenAI or Power Platform's AI Builder. So when starting conversations around responsible AI and upskilling end users (the Luke Skywalkers), you could use the visual above to talk about the principles that should be implemented inside each company. When prompting large language models to gather insights or generate new data, cross-checking the results with humans and weeding out possible hallucinations should become an obvious task.

And while some of this can be pre-scripted and implemented via governance or DevOps tasks, it's quite important that every Maker understands the company's intention of using AI services for good. Microsoft, through its partnership with OpenAI, offers the Azure OpenAI Service – so that passing data to third-party AI services can be avoided.

To find a working governance and security concept for AI services, it helps to understand what kind of business use cases GPT potentially unlocks. As a rough overview and conversation starter, I am sharing the following visual.

Visual outlining typical business scenarios GPT can help unlock

Using these examples, it can be beneficial to define first-class, company-enabled AI business scenarios, on which governance & security plus guidance can then be built. Remember, Master Yoda and Luke Skywalker train together, side by side. Prompt engineering is a new skill we all need to familiarize ourselves with rapidly, just as we figured out over years of using search engines how to refine our searches. None of us is born a prompt engineer, but we will add this skill over the years to come in order to perform our business tasks more efficiently and effectively.

May the 4th be with you. Until then,…

Power Platform | Pipelines – Design a security concept

You may have heard about the mini-series I did reviewing Power Platform Pipelines. If you missed it, the easiest way to catch up is to start here. Since then, there has been an episode on XrmToolCast with my MVP friends Scott and Daryl where we chatted about the Application Lifecycle Management capabilities of this new Power Apps in-app experience. Thanks for having me on your show. Prefer watching the video instead of just listening? Here you go. Today, I'd like to review the security roles that ship with this solution and provide some food for thought on an architecture design I recently reviewed for an enterprise customer planning to use Pipelines.

The Admin security role

Let's start with a look at the role you would assign to those, typically in DevOps, who oversee the ALM process and should be able to set up, monitor and inspect the various deployment pipelines and stages. You would assign them the Deployment Pipeline Administrator role.

Overview of privileges granted to users assigned the security role – using XrmToolBox Role Documenter

Remember, these roles come with the solution itself, which means you will find them in every host environment you have installed the Pipelines solution into. This is quite important given the current limitation that you need a Pipelines host in every region in which you have Development, Test/QA and Production type environments. In other words, a developer creating artifacts in a US-based developer environment will not be able to deploy the solution to EMEA- or Asia-based Test/QA or Production type environments.

As you can see from the screenshot above, your DevOps users would not only be allowed to administer pipelines, they would also be allowed to use the pipelines created. The permission granted is org-wide (green circle). This is important to understand if you are considering assigning the role not per user, but via Azure Active Directory security groups, granting the role through membership of the group.

To learn more about this concept, I recommend watching the recent The Low Code Revolution episode where Daniel and Prabhat talk about it in more detail.

Enterprise Architecture

As I outlined earlier, I was asked to review an enterprise architecture design for using Pipelines in an organization. To provide more context: multiple developers would work in shared developer environments, and due to specific security requirements it was essential to challenge whether all of those developers should be assigned the Deployment Pipeline Administrator role. Furthermore, the developers were located across the globe, which rules out a single Pipelines host because the Test/QA and Production type environments sit in different regions. The customer also wanted more sophisticated developers to be able to create their own pipelines in each region they develop solutions for – capable of managing, monitoring and inspecting what's going on, plus supporting other Makers using Pipelines to deploy their solution artifacts as well. Obviously that becomes a challenge if everyone is granted the security role above.

So what's an option here?

The user security role

You may now ask whether the user security role might be the answer for an enterprise architecture. Let's find out by taking a closer look at it.

Detailed view of Deployment Pipeline User security role – using XrmToolBox Role Documenter

Reviewing the privileges granted with this security role, you can see why a deployment pipeline created by an admin needs to be shared with the user before the pipeline becomes available to them in the Pipelines solution. The security role grants user-level read permissions on both the Deployment Pipeline and Deployment Stage tables to allow for exactly this scenario.

In an enterprise scenario you would again make use of Azure Active Directory security groups and assign this role via group membership, instead of sharing and granting privileges user by user.

Enterprise architecture continued

In our enterprise architecture, where multiple Makers should be equipped to deploy the artifacts they create, via solutions, from Development to Test/QA and finally to Production type environment(s), you again need to consider the privileges granted by this end-user Deployment Pipeline User security role. While for the majority of Makers (think Business Technologists) this role, plus the sharing of pipelines by admins, might work in practice, more sophisticated Makers might ask for an interim role that "sits" between the two out-of-the-box security roles. Personally, I would love to see another role added that allows more sophisticated Makers to at least create, monitor and manage their own pipelines without being granted the high-level privileges on the overall company-wide Pipelines host(s).

With the upcoming pre- and post-stage extensions, which will allow approval and review scenarios before deployment stages, such a role could also help with the Fusion Teams development concept. I've been creating software artifacts in both Dynamics 365 and Power Platform since the solution concept was first introduced. A rule of thumb was always to have a review done by another person, because you tend to test the code/solution in the way you designed and developed it, which may not be the only way your users will use it. Another aspect was collaborating and developing in teams and merging artifacts into compact solutions before they get deployed.

In our current architecture design review, we ended up going with a Pipelines host in each region for managing pipelines. We created a custom role as a copy of the user role, allowing more sophisticated Makers to become "managers" of their own pipelines, while DevOps admins remain responsible for overall quality and review in each region – a concept inherited from this customer's CoE implementation.

As always, let me know in the comments about your experiences using Pipelines in your company. Until then, happy Easter holidays to those celebrating or taking time off to relax and recharge…

Power Platform | Business Applications Launch Event Review

So yesterday was that date in April when the Power Platform community tuned in to a fireside chat with Charles Lamanna and his leadership team, introducing the latest insights on what's coming for Power Platform and Dynamics 365. Those who missed the event can sign up and watch the recording on demand. What was your take? Will we finally see natural language programming become a no-brainer? Will we see the developer community transform, with everyone part of Fusion Development teams who, instead of hanging out on Stack Overflow or the Power Platform forums looking for code snippets or fixes, get those with the help of Copilots for Power Platform, Visual Studio or VS Code? Will the next generation of applications auto-include Copilot, as users no longer dive into data via search, views and filters, but instead look for a prompt-and-response user experience?

Charles Lamanna, CVP Business Applications and Platform – introducing the new era of "big AI"

With all the AI announcements over the last couple of weeks, it's no surprise that the Business Applications Launch Event also focused heavily on generative and responsible AI, introducing and outlining the plans to use AI in two variants:

  • AI included in the user experience as an assistant to help users become more productive
  • AI used to help developers become faster than ever before

Power Platform and Azure ecosystem to transform and boost development

Though both AI narratives seem to define the start of the next era, there's no doubt that a lot of questions are still floating around about the AI momentum we see in the software technology sector. It was Steve Jobs who said:

You can’t connect the dots looking forward; you can only connect them looking backwards. So you have to trust that the dots will somehow connect in your future. You have to trust in something – your gut, destiny, life, karma, whatever. This approach has never let me down, and it has made all the difference in my life.

Steve Jobs

Expand visibility, reduce time, and enhance creativity with unified, AI-powered capabilities

Looking into this, we saw Julie Strauss outline what developers (Makers included) can expect from the Power Platform side to deliver on these narratives.

Excerpt slides presented outlining the release wave upcoming until October 2023

With all in all 105+ features becoming available in this release wave, we can hopefully expect some non-AI-related features to ship as well, improving the platform overall. A good place to visit regularly is the release planner, where you can find more detailed information about upcoming features.

One "easter egg" can be found inside Power Apps Studio, where we're finally going to see a preview capability to check out different device formats when building a responsive application that works across formats and devices.

Power Apps Studio – preview app experience for different device formats

Innovate without limits using the latest in low-code development, including new GPT-powered capabilities

While not explicitly shown during the Business Applications Launch Event, a fun fact is that infusing AI has been going on for more than four years now, starting with the disruptive AI Builder service as part of the Power Platform, which made it easier for any developer to make AI services part of their created artifacts.

Infusing AI since 2019

Another easter egg can be found in the bonus material – available for the first time – where Walter Sun outlined the guiding principles of responsible AI, which I believe can be seen as success factors for using AI in general.

Visual outlining Microsoft's six guiding principles of Responsible AI

Wrap up

It will be up to us whether we manage to use AI capabilities in line with all six guiding principles, or even more. And the near future will prove whether IDC's prediction, outlined by Charles as an opener,

IDC Future Predictions, 2022

will become a reality. Let's first await the announced features becoming available in EMEA regions to comply with the EU Data Boundary, so this new technology can be explored without worrying about data crossing EU boundaries in an uncontrolled manner.

My personal hope is that AI doesn't outshine the other improvements I am talking with customers about – like the spring sun that started shining here in my hometown the last couple of days, where you need sunglasses to really enjoy it. May AI become a helpful tool in everyone's toolchain, but no more than that, to address the next generation of challenges and requirements.

I'm looking forward to reviewing the upcoming features in detail and starting conversations with customers on how to improve the services and features being offered. Until then, …

Power Platform | Business Value vs. Cost optimization

When it comes to application development or modernizing legacy apps with low code, specifically when driven by Citizen Developers – or, as I like to call them, Business Technologists – there's often a question in the room about business value assessment. Some tend to call those rapidly created apps "crap", because they haven't yet understood that low code is there to empower everyone. So whether an app is crap or not isn't about the creator. Back in 2021 I already wrote an article about this, but today I wanted to revisit the topic based on a recent conversation: a customer asked me whether I was familiar with Gartner's 4Rs as a strategy for cost optimization.

4Rs cost optimization strategy with Power Platform

So we started our conversation with the visual above – first by aligning on the influencing factors surrounding and driving the need to continuously think about a cost optimization strategy in general. I then asked about the 4Rs, to make sure we were in sync, and got the following as a result (also found in the inner area of the visual above):

  • Reduction
  • Replacement
  • Rethinking
  • Reinvesting

Many customers have already invested in Power Platform from a Microsoft 365 perspective, but then need to boost innovation by reinvesting in premium or standalone licensing to be able to use all the features available in Power Platform. Some customers still struggle with this approach.

A typical low-code journey

I therefore enriched our conversation with the visual above, showing a typical adoption journey of a Power Platform customer and outlining why the investment in features and capabilities can be a struggle. In terms of change management, I shared that they might have started their journey confident about which use cases to select for low-code development. I challenged them on the orchestration of the platform – stepping outside the comfort zone and introducing an environment strategy that takes into account that the Default environment is not the "container" for storing all kinds of business use cases. I asked for a quick maturity assessment of their Maker community, and we finally talked about the business cases and possible business value assessments.

While doing so, we obviously first needed to catch up on how they select their use cases and decide on the tools to develop and deploy them to production. Many times it's a Maker with an idea that first runs through an ideation & creation process, to ensure that all DevOps tasks are in sync with the Maker's idea – especially if those ideas end up being business-critical use cases and not just personal productivity. From a "runner's" perspective, we assessed whether they were already operating at a Ready, Set, Go or even Boost level.

Identifying use cases for low code and deciding on tools

Discussing the 4Rs shown earlier, I presented the visual above and asked the customer whether they were using a methodology like this, which would ensure the use cases are assessed from two different angles. If you're asking why I selected this visual – it also comes from Gartner, and since the customer had challenged me with Gartner material, I assumed they follow Gartner's best practices. This triangle between outcome, business value and feasibility starts with a simple-to-understand matrix for running various assessments – for instance on the tools to be used by Creators.

On Key Dimensions

The first step here would be to identify whether the scope of the use case is short- or long-term, whether the data included or needed for the use case is unstructured or structured, and whether the scenario is very dynamic and constantly changing, or an automation of a typical routine or repetitive task or bundle of actions.

On Key Performance Indicators

The third part would allow them to assess their use-case catalog against key performance indicators (KPIs), such as TCO from a buy-vs-build perspective, or faster time to market and value, with a development process that takes into account the strengths of Fusion Development teams versus traditional software development. And of course, let's not forget the innovation factor.

Conclusion

We came to the conclusion that a good ideation and creation process, including inventory management – for instance a use-case catalog – is part of a good business value assessment strategy. Without those in place, it takes a lot more effort to run business value assessments internally, and to decide which use cases to take into account for them.

As said earlier, there are tons of personal or team productivity use cases out there where only one employee, or a small group, gets empowered by a rethought process supported by Power Platform tools. But is it worth estimating all kinds of KPIs for these if the key dimension is, for instance, a short scale or lifetime? Should you care about the business value, or just let them run and assume that the mass of all those use cases adds enough business value in general?

We couldn't end our discussion without taking a closer look at the big trend surrounding a reinvestment in Power Platform – generative AI.

Power Platform leading a new era of AI-generated low-code development

We used the visual above to further discuss the various Copilot announcements and the benefits any Maker, Creator, Business Technologist or Developer would gain from reinvesting in Power Platform – considering the usage of standalone or premium licenses and being able to use the created artifacts at run time. Why the run-time experience?

Well, most Creators these days are equipped with the Developer Plan license, which allows them to create artifacts with no cliffs. They decide on the best technology to enable their use cases rapidly, creating faster time to market and value, and being more agile in terms of change management, as requirements may vary over the run time or application lifecycle.

Having that flexibility, compared to making a decision case by case, obviously becomes easier once you have identified a set of use cases to run business value assessments for. Who doesn't like a comparison of investment versus payout these days? And circling back to all the influencing factors that impact the 4Rs: while each of them should be part of a cost-optimization strategy, it's about adding the orchestrated low-code platform and the Fusion Development teams factor to a future-ready IT & developer strategy.

Generative AI will make it even easier for any developer to rapidly modernize legacy applications or create brand-new applications and processes. Think of a Creator without COBOL skills who is asked to transform some code into a cloud-ready API. Why shouldn't this person ask generative AI to come up with a solution, instead of learning COBOL in its fullness just for this replacement? (And yes, if you're asking why I use this example: I had to learn COBOL just to pass part of my Master's.)

Generative AI has only just begun, and we will see more capabilities added rapidly. In terms of business value assessment, and using the approach I outlined to decide on low-code or code-first tools next – at least this customer now feels more empowered after our discussion. Let me know about your experiences with business value assessment inside your company. Until then, …

Power Platform | Pipelines – the code-first developer's view

Today I'd like to finish my small series with a first review of Power Platform Pipelines – this time from the view of the last user role still missing: the code-first developers. Sometimes also referred to as pro developers, I prefer to call them code-first so as not to imply that low-code developers can't be professional developers as well – bear with me, we're all Creators. The beauty of Pipelines is that it is designed to really meet everyone, from admin to Maker to every developer who wants to benefit from Application Lifecycle Management (ALM) made easy. It's as simple as counting to three.

Code-first view using VS Code

Visual Studio Code with the Power Platform Tools extension installed – Terminal window

Let's start by running straight into an error message. Switching heads to a code-first developer, I opened VS Code with the Power Platform Tools extension installed and ran the pac pipeline list command. I had authenticated against my PowerFx Development environment, as this is where my artifact development happens. As a Maker, I would have selected my solution there and initiated the deployment via Pipelines with a click on the rocket icon.

After running the command, I was presented with an error message saying that the segment "deploymentpipelines" cannot be found. Oops! What went wrong here? Well, I guess I didn't really switch heads! From a code-first developer's perspective, I should have known that I need to be authenticated against the host where my Power Platform Pipelines solution is installed. So I fixed it by switching my authentication to Power Pipelines – my host environment.
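
Not shown in the screenshot: in the terminal, this switch is just a matter of selecting (or creating) the right pac CLI authentication profile. A minimal sketch – the environment URL below is a hypothetical placeholder:

```powershell
# List the authentication profiles known to the Power Platform CLI
pac auth list

# Create a profile for the Pipelines host environment (hypothetical URL) ...
pac auth create --url https://powerpipelines-host.crm4.dynamics.com

# ... or simply switch to an existing profile by its index from 'pac auth list'
pac auth select --index 2
```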

VS Code Terminal – using Power Platform Tools

After running the command again, I was presented with the information I was looking for. Before using any of my configured pipelines, I need to figure out their ID. With this ID, I can then retrieve more details about the pipeline stages – another piece of information I need in order to use the Power Platform CLI to deploy my solution artifacts to my Test/QA environment.

Using the command pac pipeline list -p [GUID of pipeline], I can get those details, as shared in the visual above. You can see that my PowerFx Deployment Pipeline Citizen Development contains two stages:

  • Move to Production with ID starting with c77dd0
  • Move to Test with ID starting with 9d7f07
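
Put together, the discovery part boils down to two CLI calls, run from the terminal in VS Code. A short sketch – the pipeline GUID is a hypothetical placeholder:

```powershell
# List all pipelines configured in the currently selected host environment
pac pipeline list

# List the stages of one specific pipeline (hypothetical GUID); the output
# contains the stage IDs needed later for 'pac pipeline deploy'
pac pipeline list -p 00000000-0000-0000-0000-000000000000
```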

The Deployment process

Next, as a code-first developer, I want to use the Move to Test stage to deploy my solution from my PowerFx Development environment to my PowerFx Test environment – but using Pipelines.

Power Apps Studio – PowerFx Test environment

The visual above shows what the Test environment looks like before using the Power Platform CLI to deploy my solution. With the information collected in the previous steps, I am now ready to deploy.

VS Code Terminal – Power Platform Tools

The command I am using this time is pac pipeline deploy with a couple of parameters. Those are:

  • -sn – the solution unique name. I can get this with a right-click on my A small app version 1.0.0.1 solution
  • -sid – the stage ID, which I got from the previous command; in my case it's the ID of the Move to Test stage
  • -env – the GUID of my development environment. Get it with a right-click on the environment in the list and copy it
  • -cv – the current version, which is conveniently listed inside the environments & solutions tree (1.0.0.1 in my case)
  • -nv – the new release version, which I increment to 1.0.0.2
  • -w – a switch telling the CLI to wait for the deployment to finish
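
Put together, the call might look like the following sketch – the solution unique name and the GUIDs are hypothetical placeholders standing in for the real values collected above:

```powershell
# Deploy the solution via the 'Move to Test' stage, bumping the version from
# 1.0.0.1 to 1.0.0.2 and waiting for the deployment to finish.
pac pipeline deploy `
    -sn ASmallApp `
    -sid 9d7f0700-0000-0000-0000-000000000000 `
    -env 11111111-2222-3333-4444-555555555555 `
    -cv 1.0.0.1 `
    -nv 1.0.0.2 `
    -w
```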

Okay, are we ready to run this command? Let's give it a go, folks.

VS Code Terminal – Power Platform Tools

The deployment process starts, performs all kinds of checks, and then runs some tests prior to deployment. If there are no issues inside your solution (such as missing dependencies or other errors), you should see the deployment run and finish successfully.

Power Apps Studio – PowerFx Test

Switching back to Power Apps Studio and refreshing the browser, I can now see that my solution was successfully deployed into my PowerFx Test environment and the version number is shown as provided.

That's it! This is the code-first developer experience using Power Platform Pipelines. Looks awesomely smooth, right?

What's the benefit of using Pipelines?

While code-first developers might already be familiar with using the Power Platform CLI to unpack and pack their solutions and deploy them across environments, the beauty of using Pipelines is that their deployment process becomes monitored and governed by pipeline administrators. What do I mean?

Power Platform Pipelines – Deployment Pipeline Configuration

Looking into the run history of the Deployment Pipeline Configuration app, you can see both of my deployment runs monitored inside the tool. The created artifact is also stored as a managed solution zip file and can be downloaded from the app. Furthermore, all the data collected during the development process is stored inside the stage I used to deploy my solution.

A perfect combo?

In this case, I see a large benefit for Fusion Team developers using Power Platform Pipelines: their ALM activity is monitored, and debugging is possible in case of an error after deployment. With the announced improvements coming up, this experience will obviously get even smarter.

If you're interested in learning more about ALM with Power Platform, there's an upcoming webinar starting April 6th. Sign up and enjoy some insights.

Wrap up

While this might not be my last article and review on Power Platform Pipelines, it concludes my mini-series. If you've missed the previous episodes, find them here:

I hope you liked it – let me know your thoughts and experiences using Power Platform Pipelines via the comments, a tweet or a LinkedIn reply. Until then, …

Power Platform | Manage "standalone" license requests

Today's article is proudly sponsored by Power Platform admins – those managing license requests in their organizations who have thoughtfully considered an easy onboarding process for their Makers, with an ideation & clearing process before artifacts get created. Jokes aside, folks, today's article is a walkthrough of this recent announcement.

What was the first reaction I got during recent customer conversations with IT teams? Right: where can we block or deactivate this? But hey, this isn't an evil thing that causes a lot of headaches or trouble! This is for you admins, making the process a lot smoother and tailored to your needs. Let's take a closer look so it becomes clearer for everyone.

Power Apps license request dialog, which can be customized now

What is it good for?

It simplifies and customizes the process by which Makers in your organization request a license that is required by an artifact using premium capabilities (the app or flow tagged as "premium" in the Details section – License designation).

Who is it good for?

It addresses the requirement of admins who need a customizable dialog every time a user is required to have a standalone license for the run-time experience of an artifact created by a Maker in your organization. Furthermore, it simplifies Makers' lives, as everyone can now be pointed to customized or tailored documentation instead of the built-in experience.

A closer look

Let's start from the Microsoft 365 Admin Center user interface and see where license requests normally end up without this new feature being enabled.

Admin Center License Management – Incoming or Pending requests from users

Let's assume you haven't activated this new feature and a Maker created an artifact that requires a standalone or premium license. In my test case the artifacts are a canvas app using a premium connector (Dataverse) and a Power Automate flow using a premium trigger (Dataverse) and even a premium action (AI Builder).

A Maker with just an M365 E5 seeded license assigned, creating a premium-tagged artifact in the Default environment

The Maker experience – Power Automate

In the example above, a Maker has created a premium-tagged flow and was able to save it. To run the flow, however, the Maker must have a valid license. Since in my case the Maker is only assigned an M365 E5 license with seeded Power Automate and Power Apps capabilities, this flow needs a standalone license.

The Maker therefore receives an error message or notification asking them to use the flow checker to troubleshoot.

Power Automate Flow checker – A hint on how to solve the license issue

In my case, the Maker can either sign up for a trial, buy their own license, or submit a license request.

The license request

In this case – or for testing purposes – let's click on the license request link above and see what happens.

Power Automate – Make a license request dialog

As shown in the visual above, a dialog is presented where I can enter a message for my admin and select the license type I think best fits my needs. I then press Submit request, which initiates the request.

The Admin Center experience

Switching heads and shoes now, and stepping back into the role of a Power Platform admin, I open my Microsoft 365 Admin Center and take a look at the pending license requests.

Microsoft 365 Admin Center – License requests (inbox)

I can see that in my case Amy has requested to be assigned a license, and from here I could immediately work on resolving it.

Microsoft 365 Admin Center – License request Admin dialog

As you can see from the visual above, if I had licenses available in my tenant, I would be able to select one. In my case, I am asked to buy a Power Automate per-user license, as I don't have any remaining licenses in my tenant. You can also see that I can send an additional message to the requester and then hit either Approve or Don't approve.

Well, that's the standard behavior if you don't set things up to use your own existing request process instead. And that is exactly what has been introduced with the latest announcement, following the documentation.

What's the deal then?

As an administrator you can now opt to use your existing request process instead. That allows you to customize the message shown to the Maker and to provide further guidance on the company's license request process in general. Let's take a look.

Microsoft 365 Admin Center – customize the request process

Being an admin, and knowing that I've created a tailored company-wide process for requesting such licenses, I customize the request process in the dialog above. I provide a message and could also add a link to documentation.

The Maker experience – Power Automate

After the request process has been customized by an admin, I receive the same notification inside my Power Automate flow. So I initiate the request process again.

Power Automate – Customized license request

But this time, I receive the customized dialog with the message from my admins and a URL that forwards me to a landing page where I can follow the instructions.

Maker experience – Power Apps

So how does it look from Power Apps Studio, as a Maker who has created a canvas app using Dataverse and therefore needs a standalone run-time license?

Power Apps license request dialog

In this case I also receive the notification or message from my admins, along with a URL I can click on.

Okay, got it! What's in it for me?

Well, the beauty of using the customized license request experience instead of the built-in one is that you can provide broader guidance to Makers creating artifacts that require standalone licenses in your organization. I haven't shared everything with you yet: the optional URL doesn't have to point only to a landing page where Makers follow instructions on how to request standalone licenses in your company.

It could be the URL of a Power Apps app that guides the Maker through a Q&A clearing process to request the licenses needed for the run-time experience of their created artifacts. You could, for instance, use Power Automate Approvals within your application to provide the same approve/deny capabilities. You could extend it to request more detailed information prior to approval. You could also create individual Maker and admin user interfaces that allow for easy management and monitoring of these requests. The limit is just your imagination.

So instead of immediately asking "Where can I block or deactivate this?", you'd be better off checking out the flexibility and options provided by this "little" announcement from the Power Apps team. As always, let me know your thoughts and questions via the comments or social media. Until then, …

Power Platform | Pipelines – the security model

You may have already noticed me approaching the Power Platform Pipelines topic from two different angles. I started by sharing some thoughts in an article concentrating on the role of a Maker using Pipelines in their solution creation process. I then followed up with a look at the admin experience, before rejoining the Maker path again – talking about using environment variables when deploying solutions to various environments and switching connection reference endpoints, e.g. an SAP system in Dev, Test and Prod environments, each of which obviously has a different IP address.

Today, I'd like to switch back to the admin side and talk about the security model behind Power Platform Pipelines, sharing some thoughts and providing some tips to avoid running into long-term maintenance issues.

Power Platform Pipelines – incl. Security Roles

When you install the Pipelines solution and configure it to run inside the Pipelines host environment, two security roles are deployed and offered out of the box. Those security roles should be used to control the access level of the Deployment Pipeline Configuration app – in other words, who has access in which role.

In daily practice, operating in a DevOps model, a team of admins would need to be assigned the Deployment Pipeline Administrator role if you want them to have full control over all pipeline configuration without assigning them the System Administrator security role of that host environment. Those admins would then be able to design and configure pipelines as well as share those records with other Makers – enabling them to use Pipelines from within one of their environments (Dev, Test, Prod, to name examples).

Power Platform Pipelines – sharing pipelines with Makers

Admins would therefore typically use the UI above to share the created pipelines with Makers inside their organization, which could be done individually (user by user), with a team (by providing the team name), or centrally via a security group. In any case you only need to grant those Makers the Read permission.

In day-to-day operations, you may want to carefully weigh the amount of work involved in configuring this via the UI against automating these steps and including them as part of your overall DevOps process – how? With the help of Power Automate, for example.

I should also mention that users working with Pipelines in their Development, Test or Production environments also need to be assigned the Deployment Pipeline User security role, as seen in the first visual. Only this combination allows them to use Pipelines. But wait, there's another note that should not be missed – you need the Export and Import Solution privileges both in your Development environment(s) and in any target environment(s) you deploy to. So you also need to consider a security role granting these, assigned to those users in each of those environments. Only this combination lets Pipelines run properly.

Overview – Active Deployment Pipelines

Now imagine you are one of those DevOps / pipeline admins and you have a lot of developers and environments to maintain – where could the current user experience fall short? As you can see from the visual above, handling just a couple of active deployment pipelines might not be a big issue. But what if this list grows and there is an ever-increasing number of active pipelines to maintain?

One question that neither this user experience nor the currently offered Power BI dashboard answers is: which users have those pipelines been shared with? Clicking row by row and opening the share dialog window, which shows this information in the UI, may become inconvenient over time – especially as this list of active deployment pipelines grows beyond a certain number of rows for your organization.

XrmToolBox with FetchXML Builder

Lucky are those who know there's a third-party tool called XrmToolBox, which hosts a tool created by Jonas Rapp called FetchXML Builder. It allows us to use the FetchXML query language to retrieve, via the linked entity, the information about who a row has been shared with.

In the visual above I've created and executed such a FetchXML query, and as you can see it shows my two active deployment pipelines that I've shared directly with Carl Citzen (as shown in the second visual). You don't see Carl Citzen as a display name; instead, you find an attribute SharedWith_UserId in the screenshot above, which contains Carl's user GUID.

This tool also allows us to take a look at the FetchXML itself.

XrmToolbox – FetchXML Builder showing the FetchXML structure

So for those not yet familiar with Power Platform customizations and the structure of this query language, you can now see what it looks like. Why am I sharing this?

This might be the starting point for building another dashboard view for Power Platform Pipelines that, in daily practice, allows you to monitor who you or your colleagues have shared the configured pipelines with! After using it for a while, you certainly will no longer remember whether you shared a pipeline with a single user, a team or a security group.
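
For reference, a query along these lines might look roughly as follows – a hedged sketch, kept as a PowerShell here-string. The logical names (deploymentpipeline, deploymentpipelineid) and the join to principalobjectaccess are assumptions based on the Pipelines solution and the Dataverse sharing model, so the query shown in the visual above may differ in detail:

```powershell
# Hypothetical FetchXML listing active deployment pipelines together with the
# principals (users or teams) they have been shared with, via principalobjectaccess.
# Verify the logical names with FetchXML Builder before using this.
$fetchXml = @"
<fetch>
  <entity name='deploymentpipeline'>
    <attribute name='name' />
    <filter>
      <condition attribute='statecode' operator='eq' value='0' />
    </filter>
    <link-entity name='principalobjectaccess' from='objectid' to='deploymentpipelineid' alias='SharedWith'>
      <attribute name='principalid' />
      <attribute name='accessrightsmask' />
    </link-entity>
  </entity>
</fetch>
"@
```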

Conclusion

In a DevOps team, where certain ALM monitoring and administration tasks should be executed automatically, I would really love to see a dashboard or view like this offered out of the box in a future release of Power Platform Pipelines. In an enterprise world with hundreds of environments, a number of them activated for use with Pipelines, plus a larger number of Makers who may be aggregated or "hidden behind" security groups, you certainly want to maintain your ALM solution with a qualified auditing process.

System customizers, Makers or developers familiar with Power Platform in general will easily understand how to customize a solution that fulfills those goals, though it's not yet a configuration-only experience.

Please continue sending me your thoughts on using Power Platform Pipelines in practice and don't forget: keep it coming! You can comment on this announcement and in the Power Apps community. Until then, …

Power Platform | The rise of the co-piloted Business Technologist

A year ago I published a blog post about the rise of a new kind of developer – Business Technologists: those creating new applications or modernizing existing ones with the help of low-code tools such as Power Platform. A year later, and with yesterday's announcements from Microsoft around AI-infused development using a Copilot, I thought it was about time to publish this post about the evolution of the Business Technologist.

AI and low code are the future of development

With their specific knowledge of the business use cases and their passion for learning whatever technology is needed, it was Business Technologists who drove the main part of low-code adoption inside their companies. Professional developers nevertheless had AI infused into their development process as well, using GitHub Copilot, for instance, to write software code faster. Looking at the numbers shown in the slide above, it seems obvious that #GenerativeAI can't be ignored or simply called out as "the evil in the game".

CIOs are in fact thinking about how to include AI in their IT strategy from multiple perspectives – the most obvious one being the continuous goal of saving costs on IT budgets. But being human, we also tend to show some scepticism toward the massive AI announcements recently flooding the tech news. So it's no surprise people also ask: what's the best strategy to secure and govern AI usage inside a company? Or, as Satya said:

"As we build this next generation of AI, we made a conscious design choice to put human agency both at a premium and at the centre of the product. For the first time, we have the access to AI that is as empowering as it is powerful. Of course, with this empowerment comes greater human responsibility. Just as an individual can be aided by AI, AI can be influenced positively or negatively by the person using it."

Satya Nadella – The Future of Work: Reinventing Productivity with AI

In one of my previous articles, I talked about the importance of a flexible, scalable environment strategy for Power Platform inside a company. So how does that relate to AI?

Overview of a tenant environment strategy with data policies applied

We all know that one of the most flexible capabilities of Power Platform is the connectors framework – either using one of the out-of-the-box connectors, creating your own custom connector as an API wrapper, or using an independent publisher connector, such as this one. In the visual above you can see an example of what it could look like when a Business Technologist inside your company reaches out to Azure OpenAI services based on the most recent offerings.

Depending on how the Power Platform DevOps team has set up a DLP policy, a reply like:

This operation violates admin data policy 'Ultra Low (All New Environments)', which restricts the use of connector 'shared_openaiip'.

could be the result when a Business Technologist tries to reach this service via an available connector. You can see that using connectors as API wrappers has a major benefit compared to, for instance, the also-available HTTP connectors in Power Platform, which would allow a Business Technologist to call OpenAI directly with a personally created developer account and a personal API key. For those interested: there are ways to control this behavior as well without blocking the HTTP connector in general, though it takes a lot more effort and you may need additional security mechanisms offered by the underlying Azure security foundation.

And to quote Satya another time:

As we move into this new era, all of us who build AI, deploy AI, use AI, have a collective obligation to do so responsibly.

Satya Nadella – The Future of Work: Reinventing Productivity with AI

The rise of the co-piloted Business Technologist is ongoing and can be seen across the Power Platform as a whole. It starts with making it easy to keep data at the center of every application: describe your application's purpose, and a data table plus the app interface are automatically generated by Copilot in Power Apps.

Overview of GPT powered capabilities inside Power Apps

It extends to modernizing or creating new business processes with the help of Power Automate, increasing efficiency by automating the creation of your flows using natural language – simply describe the flow you would like to build.

Overview of GPT powered capabilities in Power Automate

And of course, it doesn't stop with these main tools often used in daily practice by Business Technologists. It continues with building conversational bots in minutes, again leveraging the in-studio copilot that uses generative AI to build and refine topics through natural language.

Overview of GPT powered capabilities in Power Virtual Agents

You think that's everything Business Technologists get? No, there's an even more powerful tool inside the Power Platform family called AI Builder, which adds a lot more capabilities to enhance productivity across all the artifacts they generate.

Overview of GPT powered capabilities via AI Builder

Business Technologists will adopt AI faster than traditional software development could pick things up in the past. This is good – not bad. Extending the pool of developers is needed to meet the future demand for apps, processes and artifacts. I hear a lot of people in my network talking about how they will end up fixing all those artifacts created using AI – I wonder whether they might be wrong about this, given that AI learns faster and faster thanks to the sheer power behind it. Does it eliminate traditional software development – yes. Does it eliminate software development in total – of course not!

I'll wrap up this post with a recommendation – read the Copilot announcement by Charles Lamanna, including the chance to watch some of the features outlined in short video sequences, but also read about responsible AI in general from Microsoft, something I referred to in late 2022 in this article. More to come soon about the evolution of Business Technologists. Until then, …

Power Platform | Extend your environment strategy with data policies

I was recently talking about and sharing experiences on the importance of having a scalable environment strategy these days – due to new features being added by the product team that may become part of specific company requirements, such as the recent EU Data Boundary. A Power Platform environment is a space to store, manage, and share your organization's business data, apps, chatbots, and flows. It also serves as a container to separate use cases that might have different roles, security requirements, or target audiences. Based on this definition we can easily see that environments are also the foundation of your data strategy. So why do I bring up this topic?

By 2025, 70% of orgs will need to shift their focus from big data to small and wide data in order to take full advantage of their available data sources

Gartner via Twitter

We have to understand that with each environment in Power Platform there is data floating around, but also data that is actively part of an environment (if you use Dataverse or Azure Synapse Link for Dataverse). So you can imagine that if we take Gartner's prediction of a shift from big data to small and wide data for granted, we should think of Dataverse as a central part of this focus, and therefore data policies as an essential part of your environment strategy.

Power Platform Admin Center – DLP Policies dialog

Talking about data policies, you may first think of the Power Platform Admin Center interface and setting up your policies there. Those can be tenant-level or individual (environment-level) policies. There have been a lot of improvements from the UI perspective, such as covering not only the pre-built connectors but also allowing control of custom connectors. The visual above outlines the dialog path to follow.

DLP Policies – Dialog – Custom connectors

This can become quite flexible, though it is intense work to manage a number of environments inside your tenant and understand the various combinations of policies in practice. In fact, I've often seen the DevOps team being contacted because a DLP policy blocked a Maker's work from being saved or executed.

Of course you could consider deploying a new environment every time you have special requirements for either a pre-built or a custom connector, but that could easily end up in a pile of environments. So what's the alternative? Let's say you have a shared environment with a combination of DLP policies in place. Let's continue the journey by saying there's a special team working in that environment that would love to use one of the SAP connectors. As the SAP data should be used to enrich existing data in Dataverse, you could consider modifying the DLP policies to allow the SAP connector. But you don't want it to be used by other users of this environment. You're kind of in a dilemma, aren't you?

Well, there's an almost unknown feature of DLP policies. What? You've missed something? Did you know there's the possibility of a DLP resource exemption? It's available only via PowerShell cmdlets at the moment, and I would really love to see this feature become available via the UI as well. In the link shared above, you'll also find an example for an app and a flow that you want to launch by making them DLP-exempt.
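
To give you a feel for what that looks like, here is a minimal sketch assuming the exemption cmdlets from the Microsoft.PowerApps.Administration.PowerShell module; the cmdlet names, parameters and the resource-id format are written from memory of the documentation linked above, so please verify them there before use:

```powershell
# Sketch only – verify cmdlet and parameter names against the official docs.
Install-Module Microsoft.PowerApps.Administration.PowerShell -Scope CurrentUser

$tenantId   = "00000000-0000-0000-0000-000000000000"   # hypothetical tenant id
$policyName = "11111111-1111-1111-1111-111111111111"   # GUID of the DLP policy to exempt from

# Describe the resource (here: a flow) that should be exempt from the policy
$exemptFlow = [pscustomobject]@{
    id   = "/providers/Microsoft.ProcessSimple/environments/<environmentId>/flows/<flowId>"
    type = "Microsoft.ProcessSimple/environments/flows"
}
$exemptResources = [pscustomobject]@{ exemptResources = @($exemptFlow) }

# Register the exemption on the policy
New-PowerAppDlpPolicyExemptResources -TenantId $tenantId -PolicyName $policyName -NewDlpPolicyExemptResources $exemptResources

# Review what is currently exempt
Get-PowerAppDlpPolicyExemptResources -TenantId $tenantId -PolicyName $policyName
```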

Given this example, you can imagine that your data strategy can become both flexible and governed in a granular way. Let me know in the comments if you've been using this DLP feature via PowerShell and what your experiences are. Until then, …