ALM adoption maturity model

Use the levels and capabilities in the Power Platform adoption maturity model to evaluate your organization's usage of application lifecycle management (ALM) and how the ALM Accelerator for Power Platform can help.

Quick start

Diagram of a decision tree for evaluating ALM.

The ALM Accelerator is a good match when you can verify that the following considerations apply to your organization.

  1. What is the impact of the solution?

    • Are the applications classified as critical or high business impact?
    • Who is using the application?
      • Is it a productivity application used by everyone in your organization?
      • Is it used by senior leadership to make business-impact decisions?
      • Is it shared with external users, such as partners and customers who rely on the solution as part of an external process?
    • Are there specific compliance and auditing needs?
      • How important is tracking and auditing who is using the application?
      • Does the solution have compliance and auditing requirements?
  2. Do you have the prerequisites in place?

    • Do you have a defined environment strategy for development, validation, test, and production?
    • Do you use Azure DevOps Services or can you integrate with Azure DevOps Services for source control and build pipelines?
    • Do you have license prerequisites in place?
      • Do your makers have Azure DevOps Basic (or higher) licenses?
      • Do you have Per App or Per User Power Apps licenses to access the ALM Accelerator maker solution?
  3. Are you looking to move to Level 300 - Defined or beyond?

  4. Do you have an environment strategy in place?

    • What is the request strategy for environments?
      • Do you have a process to create development, test, and production environments?
      • Is the process to request environments automated?
  5. Do you understand and have you implemented source control concepts?

  6. Are fusion development teams engaged, bringing together low-code and pro-code teams?

  7. Are your professional development teams familiar with branching and merging strategies and able to assist makers?

  8. Do your pro-code and operations teams manage Azure Pipelines?

  9. Are you a pro-code team creating components in JavaScript?

  10. Are you integrating with OpenAPI-enabled Web APIs?

  11. Are you using or planning to create plug-ins to extend business processes in Microsoft Dataverse?

  12. Do you have a support plan?

    • Who is supporting the application and solution?
    • Do you have a formal support team to manage issues with the solution?
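The decision tree above can be sketched as a simple checklist evaluator. This is an illustrative aid only, assuming the checklist items map to yes/no answers; the function name and question keys below are not part of the ALM Accelerator.

```python
# Minimal sketch of the Quick start decision tree.
# The question keys are illustrative; map them to your own assessment.

REQUIRED_CHECKS = [
    "critical_or_high_impact",   # solution is critical or high business impact
    "environment_strategy",      # dev/validation/test/prod environments defined
    "azure_devops_available",    # Azure DevOps Services for source control and pipelines
    "devops_basic_licenses",     # Azure DevOps Basic (or above) licenses for makers
    "power_apps_licenses",       # Per App or Per User Power Apps licenses
    "source_control_concepts",   # source control concepts understood and implemented
    "fusion_teams_engaged",      # low-code and pro-code teams collaborate
    "support_plan",              # a formal support plan exists
]

def alm_accelerator_fit(answers: dict) -> bool:
    """Return True only when every prerequisite check is answered True."""
    return all(answers.get(check, False) for check in REQUIRED_CHECKS)
```

A single unmet prerequisite (for example, no support plan) means the ALM Accelerator isn't a good match yet and the gap should be addressed first.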

Leading questions

If the following leading questions reveal gaps, consider what proactive steps you can take to help grow the maturity of your people, processes, and technology to move toward advanced maker integration inside your organization.

Alignment

  1. Is innovation driven by business areas, from the bottom up?

  2. Is there a common vision between IT and business?

  3. Is there a dedicated Power Platform product owner?

  4. Is there an established Center of Excellence team?

  5. Is Power Platform a key part of your digital transformation strategy?

Impact

  1. Is Power Platform targeting low-complexity scenarios?

  2. Is there limited reuse of common components and services?

  3. Do applications allow bottom-up and top-down innovation?

  4. Do applications focus on increased delivery efficiency, supporting rapidly changing business needs?

  5. Are there organization-wide initiatives to deliver large-scale integrated apps?

Strategy

  1. Is your Power Platform strategy defined?

  2. Is there a demand management process in place?

  3. Is there a defined understanding of the role of Power Platform in your organization's IT portfolio?

  4. Are business plans shared across departments?

  5. Are vision and strategy understood by all?

  6. Do enterprise architecture decisions include Power Platform capabilities?

Business value and business strategy viability

  1. What business outcomes will this solution realize?

  2. What is the expected time frame?

  3. What do you do well today?

  4. What do you want to do better?

  5. What do you want to do differently?

Technological viability

  1. Which steps are manual, and which are automated?

  2. How measurable are the qualitative and quantitative outcomes?

  3. What dashboard and reporting capabilities allow stakeholders to visualize, drill into, and act on data?

  4. How readily available are analytics?

  5. How frequently are analytics updated?

  6. How frequently are changes required?

  7. What is the technical debt that needs to be accounted for?

  8. What are the security implications?

Financial viability

  1. What is the economic value added?

  2. Does this address the current market model or is a new model being developed?

  3. What is the time horizon for implementation?

  4. What investment model is required?

Business impact

Critical

The system severely affects production, operations, deployment deadlines, or profitability. Multiple users or services are affected.

Initial response time is less than 60 minutes with 24x7 access.

Issues demand an immediate response and require around-the-clock attention.

High

The system has moderate business impact and can be dealt with during business hours. A single user, multiple users, or customers are affected.

Initial response time is one hour, four hours, or the next business day, with support available 24x7.

Noncritical

The system has minimal business impact. The issue is important but doesn't significantly affect service or productivity. Acceptable workarounds are considered.

Initial response time is four to eight hours or more, with business-hours access and support.
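The three business-impact tiers above can be captured as a small lookup table when you triage incoming issues. The structure and field names below are an illustrative encoding of the targets described in the text, not an official schema.

```python
# Business-impact tiers and their initial response targets, as described above.
# Field names are illustrative assumptions.

IMPACT_TIERS = {
    "critical": {
        "initial_response": "under 60 minutes",
        "coverage": "24x7",
    },
    "high": {
        "initial_response": "1 hour, 4 hours, or next business day",
        "coverage": "24x7",
    },
    "noncritical": {
        "initial_response": "4-8 hours or more",
        "coverage": "business hours",
    },
}

def response_target(severity: str) -> str:
    """Look up the initial response target for a severity classification."""
    tier = IMPACT_TIERS[severity.lower()]
    return f"{tier['initial_response']} ({tier['coverage']})"
```

Classifying each solution against one of these tiers up front makes the support expectations explicit before an incident occurs.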

Administration and governance

  1. Who can create environments?

  2. What data loss prevention (DLP) policies are in place?

  3. Do Power Platform Service Admin roles exist to administer Power Platform tenants and environments?

  4. Are tenants and environments isolated from each other?

  5. Is there monitoring in place?

  6. Are custom environments used for specific use cases and ALM scenarios?

Support

  1. Are apps created by makers supported by a help desk or dedicated team?

  2. Has an application and solution risk profile been defined that details the level of support the solution will receive?

  3. Is there an ongoing continuous improvement plan for the application?

  4. Are there clearly defined roles and responsibilities for the solution?

  5. Do the roles and responsibilities include ownership to build and operate the solution?

Nurture citizen developers (makers)

  1. Do you have a training and upskilling program for your makers to help them learn key concepts to grow your pool of makers?

  2. Do you have an internal champions community established?

  3. Have you adopted the CoE Starter Kit – Nurture module?

  4. Do you have show-and-tell sessions to demonstrate advanced maker concepts?

  5. Do you have an adoption campaign to demonstrate how fusion development processes work?

  6. Do you have a career path option for makers?

  7. Have you built a community of mentors to share advanced maker concepts and best practices?

  8. Do you have a common development strategy and goals for citizen and professional developers?

Automation

  1. Are environment and DLP connector policy requests automated?

  2. Is there communication about processes and compliance between admins and makers? Is this communication automated?
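An automated environment-request process typically validates a request against policy before anything is provisioned. The sketch below shows that validation step only; every field name and rule is a hypothetical assumption for illustration, not part of any Power Platform API.

```python
# Hypothetical sketch of automating an environment request review:
# validate a request against simple policy rules before provisioning.
# All field names and rules here are illustrative assumptions.

ALLOWED_TYPES = {"development", "test", "production"}

def validate_environment_request(request: dict) -> tuple[bool, str]:
    """Approve a request only when it names a supported environment type,
    a business justification, and an owning team for support."""
    if request.get("environment_type") not in ALLOWED_TYPES:
        return False, "unsupported environment type"
    if not request.get("justification"):
        return False, "missing business justification"
    if not request.get("owning_team"):
        return False, "no owning team for support"
    return True, "approved"
```

In practice, a rejected request would trigger an automated message back to the maker explaining the policy gap, keeping the admin-to-maker communication loop automated as well.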

Fusion teams

  1. Do you have standard libraries, custom connectors, and components to be consumed by makers?

  2. Do fusion teams need to manage source control and the app lifecycle; for example, build, verification, test, and production?

  3. Do you have cross-functional teams that plan and execute work jointly, including makers, testers, and operational teams?

  4. Do you have a common development strategy and goals for citizen and pro developers needed for new projects?