May 29, 2025

Streamline SPFx Builds with Azure DevOps CI/CD Pipeline – Part 1: Setting Up Continuous Integration

Introduction

If you're developing SharePoint Framework (SPFx) solutions, streamlining your build process is key. In this post, we’ll walk through setting up a Continuous Integration (CI) pipeline using Azure DevOps to automatically prepare your SPFx solution whenever you push code.


Step 1: Set Up the Repo

  • First, create a DevOps repository and add your SPFx source code into a dedicated folder.


Step 2: Create the CI Pipeline

  • Navigate to Pipelines: In Azure DevOps, go to the Pipelines section and click Create Pipeline.

  • Classic Editor: Choose the classic editor for easier configuration.

  • Repo & Branch: Select your repository and branch, then continue.

 

Step 3: Configure Pipeline Jobs

  • Empty Job: Choose the empty job template.
  • Add Tasks:

    1. Node.js Tool Installer: Set the Node.js version required for your SPFx project.
    2. NPM Install: Add the npm task to install dependencies (install command).
    3. Gulp Clean: Add the gulp task with the clean target.
    4. Gulp Build: Repeat the above for the build target.
    5. Gulp Bundle: Add another gulp task for the bundle target.
    6. Gulp Package Solution: Finally, add a gulp task for package-solution.

    Step 4: Handle Artifacts

    • Copy Files: Add a task to copy the generated .sppkg file from the solution folder to the drop folder.

    Step 5: Publish Artifacts

    Add a task to publish the pipeline artifacts for the next stage.

    Step 6: Set Up Triggers

    In the trigger settings, enable continuous integration and add a path filter for your SPFx source folder. This ensures that every commit to that path triggers the pipeline automatically.
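    This post uses the classic editor, but for reference, roughly the same pipeline expressed in YAML looks like this (a sketch, not the classic editor's exact output: the spfx-solution folder name is an assumption, and the Node version must match what your SPFx version requires):

      trigger:
        branches:
          include:
            - main
        paths:
          include:
            - spfx-solution   # only commits to the SPFx folder trigger the build

      steps:
        - task: NodeTool@0
          inputs:
            versionSpec: "18.x"   # use the Node version your SPFx version supports
        - task: Npm@1
          inputs:
            command: "install"
            workingDir: "spfx-solution"
        - task: Gulp@1
          inputs:
            gulpFile: "spfx-solution/gulpfile.js"
            targets: "clean"
        - task: Gulp@1
          inputs:
            gulpFile: "spfx-solution/gulpfile.js"
            targets: "build"
        - task: Gulp@1
          inputs:
            gulpFile: "spfx-solution/gulpfile.js"
            targets: "bundle"
            arguments: "--ship"   # release build; omit for a debug bundle
        - task: Gulp@1
          inputs:
            gulpFile: "spfx-solution/gulpfile.js"
            targets: "package-solution"
            arguments: "--ship"
        - task: CopyFiles@2
          inputs:
            SourceFolder: "spfx-solution/sharepoint/solution"
            Contents: "*.sppkg"
            TargetFolder: "$(Build.ArtifactStagingDirectory)/drop"
        - task: PublishBuildArtifacts@1
          inputs:
            PathtoPublish: "$(Build.ArtifactStagingDirectory)/drop"
            ArtifactName: "drop"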


    Conclusion:

    With this setup, every push to your repository will automatically trigger the CI pipeline, ensuring your SPFx solution is built and packaged consistently.

    Next Up: In Part 2 – Deploy SPFx Solution to SharePoint App Catalog using Azure DevOps – we’ll cover how to automate the deployment of the .sppkg file to your SharePoint App Catalog as part of a Continuous Deployment (CD) pipeline.

    If you have any questions you can reach out to our SharePoint Consulting team here.

    Streamline SPFx Builds with Azure DevOps CI/CD Pipeline – Part 2: Automating Deployment (CD)

    Introduction

    In Part 1, we built a Continuous Integration (CI) pipeline to automatically package our SharePoint Framework (SPFx) solution. Now it’s time to automate deployment across multiple SharePoint sites using Continuous Deployment (CD).

    This blog covers setting up a CD pipeline in Azure DevOps, triggered automatically after your CI pipeline finishes, to deploy your .sppkg file to all the required sites. 


    Step 1: Prepare Your PowerShell Deployment Script

    We’ll use PowerShell with PnP.PowerShell to handle the deployment. This script connects to SharePoint Online, uploads the .sppkg package, and publishes it on each target site.

    Here's the script:

     Param (
         [string] $RootPath = $(Throw "Root Path is required."),
         [string] $ClientId = $(Throw "ClientId is required."),
         [string] $TenantId = $(Throw "TenantId is required."),
         [string] $TenantURL = $(Throw "Tenant URL is required."),
         [string] $ClientSecret = $(Throw "Client Secret is required."),
         [string] $packagename = $(Throw "Package name is required.")
     )

     $SiteURL = "$($TenantURL)/sites/TestSite"
     $ModernPOPAppPath = "$RootPath\drop\$packagename"
     $currentPOPSite = ""

     # Re-establish the PnP connection after a transient failure
     function ReconnectPNPConnection {
         Write-Host "Reconnecting to site $($currentPOPSite)"
         Disconnect-PnPOnline
         Connect-PnPOnline -Url $currentPOPSite -ClientId $ClientId -ClientSecret $ClientSecret -Tenant $TenantId
     }

     # Upload and publish the .sppkg to the site collection app catalog, retrying on failure
     function addCustomSPFxApps {
         $appCatalog = Get-PnPSiteCollectionAppCatalog -CurrentSite -ErrorAction SilentlyContinue
         while ($null -eq $appCatalog) {
             Write-Host "Waiting for app catalog creation..."
             Start-Sleep -Seconds 30
             $appCatalog = Get-PnPSiteCollectionAppCatalog -CurrentSite -ErrorAction SilentlyContinue
         }
         $uploadApp = $false
         while ($uploadApp -eq $false) {
             try {
                 Write-Host "Uploading SPFx app..."
                 $App = Add-PnPApp -Path $ModernPOPAppPath -Scope Site -Overwrite -Timeout 900
                 Write-Host "Publishing SPFx app..."
                 Publish-PnPApp -Identity $App.Id -Scope Site -SkipFeatureDeployment -ErrorAction SilentlyContinue
                 Write-Host "SPFx app deployed successfully."
                 Start-Sleep -Seconds 60
                 $uploadApp = $true
             } catch {
                 Write-Host $_.Exception.Message
                 Write-Host "Retrying after 30 seconds..."
                 Start-Sleep -Seconds 30
                 ReconnectPNPConnection
             }
         }
     }

     Write-Host -f Cyan "Installing PnP.PowerShell module..."
     [Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
     Install-Module PnP.PowerShell -RequiredVersion 2.4.0 -Force -Scope CurrentUser
     Import-Module -Name "PnP.PowerShell"

     Write-Host "Connecting to $($SiteURL)..."
     Connect-PnPOnline -Url $SiteURL -ClientId $ClientId -ClientSecret $ClientSecret -Tenant $TenantId
     $currentPOPSite = $SiteURL
     addCustomSPFxApps
     Write-Host "Disconnecting from $($SiteURL)..."
     Disconnect-PnPOnline
    

    We need to upload this PowerShell script to a folder in the repository (for example, a Documents folder) and add a step in the CI (Continuous Integration) pipeline to copy this file into the artifacts.


    Step 2: Set Up the CD Pipeline in Azure DevOps

    1. Create the Release Pipeline:

    • Navigate to the Release section in Azure DevOps.
    • Click New Pipeline.

    2. Add Artifacts:

    • Link your CI pipeline’s artifacts (the .sppkg file and related files).

    3. Enable Continuous Deployment Trigger:

    • Click the lightning bolt icon next to the artifact and enable the trigger to deploy automatically after CI.

    4. Add a Stage:

    • Click Add a stage and choose Empty Job.

    5. Add PowerShell Task:

    • In the stage, click to add a task.
    • Choose PowerShell.
    • Set the script path to point to your deployment script (e.g., Documents/CDScript.ps1).
    • Set the necessary parameters (RootPath, ClientId, TenantId, etc.).
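    For example, the task's Arguments field might look like the following (a sketch: "_SPFx-CI" is an assumed artifact alias, the package name is hypothetical, and the secret values should come from secured release variables):

      -RootPath "$(System.DefaultWorkingDirectory)\_SPFx-CI" -ClientId "$(ClientId)" -TenantId "$(TenantId)" -TenantURL "$(TenantURL)" -ClientSecret "$(ClientSecret)" -packagename "spfx-solution.sppkg"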

    6. Save and Deploy:

    • Save the pipeline and test it by running the full CI/CD process.

    Conclusion

    By integrating both CI and CD pipelines in Azure DevOps, you’ve established a fully automated and reliable deployment process for your SPFx solutions:

    • CI Pipeline: Automatically builds and packages your SPFx solution on every code push.
    • CD Pipeline: Seamlessly deploys the package to all required SharePoint sites immediately after a successful build.

    This setup not only saves time but also ensures consistency and reduces the risk of manual deployment errors across environments.

    If you have any questions you can reach out to our SharePoint Consulting team here.

    How to Use Postman’s Postbot for AI-Powered API Testing

    Introduction

    API testing is often repetitive and time-consuming. Postman’s Postbot, an AI-powered assistant, revolutionizes this process by automating and enhancing API testing with intelligent, context-aware suggestions.

    Postbot AI assistant features:

    1. Test Script Generation
    2. Fix Broken Tests
    3. Expert Debugging
    4. Intuitive Prompt Response
    5. Smart Suggestions
    6. Visualise Responses
    7. Add More Tests
    8. Save a field from response
    9. API Documentation
    10. Test Suite Collection
    11. Get Help

    1. Test Script Generation

    Postbot is an intelligent assistant in Postman that helps you generate automated test scripts for your API endpoints with ease.

    Steps to generate Test Scripts with Postbot:

    • Open Postman
    • Navigate to the Script tab > Click on the “Ask Postbot” icon
    • Type a natural language prompt like: “Test if the response status is 200”; “Verify whether the response body has the attribute ‘token’”; “Validate that the array length is 10”; “Ensure the response time is under 500ms”
    • Postbot automatically generates the relevant JavaScript code for your test case and inserts it directly into your test script area, ready for review and execution.
    • Click “Save” and Run the Request: Postbot’s script will run automatically each time you send the request. 
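    For example, the first and last prompts above typically yield scripts along the following lines (a sketch of the kind of code Postbot generates; the exact output can vary):

      pm.test("Response status is 200", () => {
          pm.response.to.have.status(200);
      });

      pm.test("Response time is under 500ms", () => {
          pm.expect(pm.response.responseTime).to.be.below(500);
      });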


    2. Fix Broken Tests

    Postbot AI is a smart assistant in Postman that not only helps generate test scripts but also diagnoses and fixes broken or incorrect syntax. When test assertions fail, Postbot steps in to identify syntax or logic issues and suggest corrections, saving valuable debugging time.

    Steps to Auto-Fix Script Errors with Postbot:

    • Open Postman
    • Navigate to "Script" tab and write any script (intentionally include a syntax error to test).
    • Run the request by clicking Send. If there's a syntax or runtime error, Postman will highlight it in the test results area.
    • Observe the “Fix Script” Icon. You’ll see a “Fix Script” button/icon next to the error message. This appears when Postbot detects issues it can resolve.
    • Click on “Fix Script". Postbot will analyze the error and automatically suggest or apply a fix. You can review and accept the changes or modify them as needed.

    3. Expert Debugging

    Postbot is available to assist when unexpected errors occur during request execution. We can select 'What's wrong?' in the error message. Any issues that Postbot detects will be reported to you, along with potential fixes.

    Expert Debugging by Postbot empowers developers and testers to:

    • Diagnose failing test scripts
    • Identify syntax errors and runtime issues
    • Automatically fix broken test logic
    • Understand what went wrong and why
    • Receive step-by-step guidance on resolving complex issues

    4. Intuitive Prompt Response

    An intuitive prompt response refers to a system (like Postbot or any AI assistant) that can understand natural, user-friendly input and respond in a way that feels clear, relevant, and context-aware, even if the input is informal or non-technical.

    In practice, an intuitive prompt response means you can type:

    “Check if the response includes an email field and status is 200”

    …and Postbot generates the corresponding tests:

    pm.test("Status is 200", () => {
      pm.response.to.have.status(200);
    });

    pm.test("Response has email", () => {
      const jsonData = pm.response.json();
      pm.expect(jsonData).to.have.property("email");
    });

    5. Smart Suggestions

    Postman's Postbot uses a sophisticated AI language model to provide intelligent, context-aware code recommendations right in the test editor, helping developers write effective API tests faster and with greater accuracy.

    How Postbot enhances test writing:

    • Real-Time Code Suggestions: As developers begin typing test scripts in the Tests tab, Postbot proactively suggests common test behaviors, helping avoid boilerplate repetition and syntax errors.
    • Response-Aware Completions: Postbot analyzes the live API response or any saved response examples to generate test suggestions tailored to the structure and content of the response.
    • Function Name-Based Proposals
    • Data-Driven Test Recommendations
    • Boosts Productivity

    6. Visualise Responses

    Postbot can render API responses in a more intuitive, visual form that is easier to interpret:

    • Visualise response as a table: view the response data in tabular form.
    • Visualise response as a line chart: view the response data as a line chart.
    • Visualise response as a bar chart: view the response data as a bar chart.
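    Under the hood, these options generate a script based on the pm.visualizer API. A minimal table visualization looks roughly like this (a sketch, assuming the response is a JSON array of objects with name and value fields):

      const template = `<table>
          <tr><th>Name</th><th>Value</th></tr>
          {{#each rows}}
          <tr><td>{{name}}</td><td>{{value}}</td></tr>
          {{/each}}
      </table>`;

      // Postman renders the template with the supplied data in the Visualize tab
      pm.visualizer.set(template, { rows: pm.response.json() });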


    7. Add More Tests

    The Add More Tests feature suggests additional test cases relevant to what you are currently testing (see the example after these steps).
    • Open any request in Postman.
    • Navigate to the “Script” tab.
    • Click on the “Add More Tests” button (usually found below the test editor).
    • Choose from common test templates or use Postbot to generate custom tests.
    • You can ask in plain language (e.g., "Add a test to check if the response time is under 500ms").
    • Click Insert to add the suggested test to your script area.
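    For instance, asking Postbot to "Validate that the array length is 10" might insert a test like this (a sketch):

      pm.test("Response array has 10 items", () => {
          const jsonData = pm.response.json();
          pm.expect(jsonData).to.be.an("array").with.lengthOf(10);
      });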

    8. Save a Field From Response

    You're calling an API, and you want to save a field from the response (e.g., token, userId, orderId) for reuse in:
    • Another request
    • Headers or body
    • Conditional logic in tests

    Steps to save a field from response:

    • Send API request to get a valid response.
    • Navigate to the “Script” tab under the request.
    • Click “Ask Postbot”.
    • Type a prompt like: "Save the Token field from the response to an environment variable"; "Extract userId and store it as a collection variable"
    • Postbot will generate the script for you.
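    For the prompts above, the generated script typically looks like this (a sketch; the field names depend on your response):

      const jsonData = pm.response.json();

      // Save the token to an environment variable for reuse in later requests
      pm.environment.set("token", jsonData.token);

      // Or store userId at collection scope
      pm.collectionVariables.set("userId", jsonData.userId);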

    9. API Documentation

    Steps to create API documents:
    • Open Postman and go to your Collection or a specific Request.
    • Navigate to the Documentation section (either in the request pane or via the collection sidebar).
    • Click “Ask Postbot” in the documentation field.
    • Use natural prompts like: “Write a detailed description of this GET request”; “List the headers and body of this API call in brief”; “Add usage examples with response format”; “Generate markdown documentation for this API”
    Postbot will populate or enhance the documentation accordingly. You can review and edit before publishing.

    10. Test Suite Collection

    Steps to create a Test Suite Collection:
    • Open Postman and go to your Collections tab.
    • Click on an existing collection or create a new one.
    • Click the three-dot menu next to the collection or folder name, then click “Generate Tests with Postbot”.
    • In the Postbot sidebar, type a prompt like: “Generate basic tests for all requests in this collection”; “Create a test suite to validate response status, body fields, and response time”; “Add error response validation for each request”

    Postbot will:

    • Add relevant tests to the Tests tab of each mentioned request
    • Suggest test case names and logic
    • Optionally handle parameterized test data (e.g., from CSV/JSON files) 

    11. Get Help

    When you're stuck, there's no need to leave Postman or search the web — Postbot is your in-app AI assistant, ready to help you with contextual, intelligent, and fast answers based on Postman's official Learning Center and trusted community sources.
    • Expertly Curated Instructions: Postbot pulls relevant information from official documentation, tutorials, and FAQs to provide accurate and relevant support, right when you need it.
    • No More Searching: Instead of jumping to external forums or search engines, simply ask Postbot your question directly inside Postman.
    • Context-Aware Help: Whether you're writing tests, debugging scripts, or visualizing responses, Postbot understands your context and offers tailored help.

    Conclusion

    Postbot is more than just a code generator - it’s your AI test partner inside Postman. Whether you're building new APIs, maintaining old ones, or scaling test coverage across teams, Postbot transforms the way you approach API quality.

    If you have any questions you can reach out to our SharePoint Consulting team here.

    May 22, 2025

    Step-by-Step Guide: Create a PCF Control to Record Video in Power Apps

    Introduction

    In Power Apps, the default controls are often enough for typical business needs, but what if you need a more customized user interface or logic that goes beyond the built-in features? That’s where the PowerApps Component Framework (PCF) comes into play.

    PCF enables developers to create custom components using modern web technologies like HTML, CSS, and TypeScript. These components can be reused just like standard Power Apps controls but offer much greater flexibility and power.
     
    With PCF, you can build: 

    • Rich visual components like sliders, charts, or custom input fields
    • Controls that connect to external data sources
    • Reusable UI elements across different apps and environments

    Whether you're enhancing user experience or integrating third-party libraries, PCF gives you the tools to take your apps beyond low-code.

    In this blog, we’ll walk you through creating a custom PCF control that records video directly within a Power Apps app - perfect for scenarios like field service reporting.
     
    Prerequisites
    Before you begin, ensure you have the following tools and knowledge:
    • Node.js (LTS version)
    • Power Platform CLI (pac)
    • Visual Studio Code
    • A Microsoft Power Apps Developer Environment

    Use Case: Record Video in Power Apps
    Imagine a field engineer needing to quickly document a machine fault. Instead of recording a video externally and uploading it later, a custom PCF control enables the engineer to record and save the video directly within the mobile app, streamlining the entire submission process.


    Step-by-Step Guide to Creating the PCF Control

    1. Create the PCF Control

    • Open Visual Studio Code and run:

    pac pcf init --namespace VideoRecoder --name VideoRecoder --template field 

    • Then install the dependencies:

     npm install

     2.  Add Video Recording Logic (in index.ts)

    • Use the MediaRecorder API to access the camera and record video.

    private async startRecording(): Promise<void> {
        this._errorElement.innerText = "";
        try {
            // Request the back camera with facingMode: "environment"
            const stream = await navigator.mediaDevices.getUserMedia({
                video: { facingMode: "environment" },
                audio: true
            });
            this._videoElement.srcObject = stream;
            this._videoElement.play();

            this._mediaRecorder = new MediaRecorder(stream);
            this._recordedChunks = [];

            // Collect recorded data as it becomes available
            this._mediaRecorder.ondataavailable = (event) => {
                if (event.data.size > 0) {
                    this._recordedChunks.push(event.data);
                }
            };

            this._mediaRecorder.start();
            this._startButton.disabled = true;
            this._stopButton.disabled = false;
            this._uploadButton.disabled = true;
        } catch (error) {
            this._errorElement.innerText = "Failed to access camera/microphone. Please ensure permissions are granted.";
            console.error("Recording error:", error);
        }
    }

    private stopRecording(): void {
        if (this._mediaRecorder && this._mediaRecorder.state !== "inactive") {
            this._mediaRecorder.stop();
            this._videoElement.srcObject = null;
            this._startButton.disabled = false;
            this._stopButton.disabled = true;
            this._uploadButton.disabled = false; // Enable upload button after stopping
        }
    }
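    To hand the recording back to Power Apps, the control also needs to surface the recorded data as an output. One way to do that (a sketch, assuming the manifest declares a videoData output property and that the notifyOutputChanged callback from init() was saved on the instance) is to convert the recorded chunks to a base64 string when recording stops:

      private finalizeRecording(): void {
          // Combine the recorded chunks into a single Blob
          const blob = new Blob(this._recordedChunks, { type: "video/webm" });
          const reader = new FileReader();
          reader.onloadend = () => {
              // Strip the "data:video/webm;base64," prefix from the data URL
              this._videoBase64 = (reader.result as string).split(",")[1];
              this._notifyOutputChanged(); // callback captured in init()
          };
          reader.readAsDataURL(blob);
      }

      public getOutputs(): IOutputs {
          // "videoData" is an assumed output property from the control manifest
          return { videoData: this._videoBase64 };
      }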


    You can view the complete source code here.
    • Run the development server to test locally:
    npm start

    3. Build the PCF Control into a Solution
    • Create a new solution folder and initialize it:
    mkdir VideoRecoderSolution
    cd VideoRecoderSolution
    pac solution init --publisher-name YourPublisher --publisher-prefix yourprefix
    Then add the reference and build:
    pac solution add-reference --path ..\..\
    dotnet build
    • A .zip file will be created inside VideoRecoderSolution > bin > Debug.

    4. Import the Solution into Power Apps

    • Open your Power Apps portal
    • Navigate to Solutions

    • Click Import Solution and select the generated .zip file


    • Click Next > Import
    Once imported:
    • Go to your app
    • Click Insert > Get More Components

    • Click on code and select your custom component
    • Click Import, and it will be available under Code Components


    Conclusion

    With PCF, you unlock a whole new level of customization in Power Apps. In this tutorial, we created a practical video recording control - ideal for industries like field services, insurance, or any scenario where direct media input is valuable.


    If you have any questions you can reach out to our SharePoint Consulting team here.

    Automate Azure Logic App Deployment with Azure DevOps CI/CD and ARM Templates – Step-by-Step Guide

    Managing Azure Logic Apps across multiple environments can be tedious and error-prone when you rely on manual deployments. Fortunately, Azure DevOps CI/CD pipelines combined with ARM templates let you fully automate Logic App deployments, removing manual steps and guaranteeing consistent infrastructure-as-code releases.

    This blog walks you through setting up a complete automated deployment pipeline for Logic Apps, from exporting templates to running parameterized deployments using YAML pipelines.

    Prerequisites:
    Before we begin, make sure you have the following in place:

    • An existing Logic App in your Azure subscription.

    • An Azure DevOps project with permissions to create service connections and pipelines.

    • Basic familiarity with ARM templates and YAML


    Step 1: Export the ARM Template of the Logic App
    • Go to the Azure Portal and navigate to your Logic App.

    • In the left pane, select Automation > Export template.

    • Click Download to save the template.json and parameters.json files.

    These files define the infrastructure and configuration of your Logic App and are essential for reproducible deployments.


    Step 2: Create a Service Connection in Azure DevOps
    To allow Azure DevOps to deploy resources to Azure, you need a service connection:

    • Go to your Azure DevOps project > Project settings > Service connections.

    • Click New service connection > choose Azure Resource Manager.

    • For Identity Type, keep the default: App registration (automatic).

    • For Credential, keep the default: Workload identity federation.

    • Choose Scope Level > Subscription.

    • Choose your Azure subscription and resource group.

    • Name the service connection (e.g., AzureServiceConnection) and save.

     

    Step 3: Create a Repository and Add Template Files
    • Create a new Git repository in Azure DevOps or use an existing one.

    • Commit the downloaded files template.json and parameters.json into the repository.

    • If you manage multiple Logic Apps or environments, organize the files into a clear folder structure.

     

    Step 4: Create a Variable Group (Optional but Recommended)
    To support environment-specific deployments, create a variable group:

    • Go to Pipelines > Library in Azure DevOps.

    • Create a new variable group.

    • Add variables to replace parameters, connections, or static values, such as:
         
    1. LogicAppName

    2. Location

    3. SubscriptionId

    4. ResourceGroup

    • After creating the pipeline, grant it permission to access the variable group. Alternatively, you can configure the variable group to be accessible by all pipelines within the project by enabling the “Open Access” option.
       
    • Alternatively, you can define variables directly in your YAML pipeline using variables: if you want to keep everything within the pipeline file.
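    A minimal inline equivalent would look like this (values are placeholders):

      variables:
        LogicAppName: "TestLogicApp-Dev"
        Location: "eastus"
        SubscriptionId: "<your-subscription-id>"
        ResourceGroup: "rg-logicapps-dev"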

     

    Step 5: Create a YAML Pipeline
    Now, create a YAML pipeline to:

    • Replace parameters dynamically.

    • Deploy the Logic App using ARM templates.

    • Use variables for flexibility across environments.

    Here’s a sample pipeline (azure-pipelines.yml):

    trigger:
        branches:
            include:
            - main
    
    stages:
      - stage: Build
        displayName: "Build And Deploy"
        jobs:
          - job: BuildLogicApp
            displayName: "Deploy Logic App to Azure"
            pool:
              vmImage: "windows-latest"
    
            variables:
              # Store the variables in the group for dynamic use
              - group: TestLogicApp-Variables
    
            steps:
              - checkout: self
    
              # Preprocess template.json to replace hardcoded values
              - task: PowerShell@2
                displayName: "Preprocess ARM Template"
                inputs:
                  targetType: "inline"
                  script: |
    
                    $templatePath = "$(Build.SourcesDirectory)/template.json"
    
                    $jsonRaw = Get-Content $templatePath -Raw | ConvertFrom-Json
    
                    # Update LogicApp name parameter
                    if ($jsonRaw.parameters.workflows_TestLogicApp_01_name) {
                        Write-Host "Updating Logic App name to: $(LogicAppName)"
    
                        $jsonRaw.parameters.workflows_TestLogicApp_01_name.defaultValue = "$(LogicAppName)"
                    } else {
                        Write-Host "Error: workflows_TestLogicApp_01_name parameter not found!" -ForegroundColor Red
                        exit 1
                    }
    
                    # Update location parameter
                    if ($jsonRaw.resources[0].location) {
                        Write-Host "Replacing WorkFlow location defaultValue with $(Location)"
                        $jsonRaw.resources[0].location = "$(Location)"
                    } else {
                        Write-Host "Warning: Location parameter not found in resoures!" -ForegroundColor Yellow
                    }
    
                    # Update the connections properties
                    if ($jsonRaw.resources[0].properties.parameters.'$connections') {
                        $connections = $jsonRaw.resources[0].properties.parameters.'$connections'.value
                        if ($connections.sharepointonline) {
                            # Replace subscriptionId in the id field
                            $connections.sharepointonline.id = $connections.sharepointonline.id -replace '/subscriptions//', "/subscriptions/$(SubscriptionId)/"
                            Write-Host "Replaced sharepointonline id subscriptionId: $(SubscriptionId)"
                            # Replace location in the id field
                            $connections.sharepointonline.id = $connections.sharepointonline.id -replace '/locations//', "/locations/$(Location)/"
                            Write-Host "Replaced sharepointonline id location: $(Location)"
                            # Update connectionName
                            $connections.sharepointonline.connectionName = "$(SharepointConnectionName)"
                            Write-Host "Replaced sharepointonline connectionName: $(SharepointConnectionName)"
                        }
                    }
                                  
                    # Save the updated template
                    $jsonString = $jsonRaw | ConvertTo-Json -Depth 100 -Compress
                    $jsonString | Set-Content $templatePath -Encoding UTF8
                    Write-Host "Updated template.json with all dynamic values saved to $templatePath"
    
              - task: PowerShell@2
                displayName: "Prepare Exported Templates Directory"
                inputs:
                  targetType: "inline"
                  script: |
                    $exportPath = "$(Build.ArtifactStagingDirectory)/exportedTemplates"

                    # Create the export folder before copying files into it
                    New-Item -ItemType Directory -Path $exportPath -Force | Out-Null

                    # Copy template.json
                    $templatePath = "$(Build.SourcesDirectory)/template.json"
                    if (Test-Path -Path $templatePath) {
                        Copy-Item -Path $templatePath -Destination "$exportPath/template.json" -Force
                    } else {
                        Write-Host "Warning: template.json not found in source directory"
                    }
    
                    # Copy parameters.json
                    $parametersPath = "$(Build.SourcesDirectory)/parameters.json"
                    if (Test-Path -Path $parametersPath) {
                        Copy-Item -Path $parametersPath -Destination "$exportPath/parameters.json" -Force
                    } else {
                        Write-Host "Warning: parameters.json not found in source directory"
                    }
    
                    Write-Host "Exported templates prepared successfully."
    
              - task: PowerShell@2
                displayName: "Debug Exported Templates"
                inputs:
                  targetType: "inline"
                  script: |
                    Get-Content "$(Build.ArtifactStagingDirectory)/exportedTemplates/template.json" | Write-Host
                    Get-Content "$(Build.ArtifactStagingDirectory)/exportedTemplates/parameters.json" | Write-Host
    
              - task: PublishBuildArtifacts@1
                displayName: "Publish Exported Templates as Artifacts"
                inputs:
                  PathtoPublish: "$(Build.ArtifactStagingDirectory)/exportedTemplates"
                  ArtifactName: "LogicAppExportedTemplates"
    
              - task: DownloadBuildArtifacts@0
                displayName: "Download Published Artifacts"
                inputs:
                  buildType: "current"
                  downloadType: "single"
                  artifactName: "LogicAppExportedTemplates"
                  downloadPath: "$(Pipeline.Workspace)/ExportedTemplates"
    
              - task: PowerShell@2
                displayName: "Debug Downloaded Artifacts"
                inputs:
                  targetType: "inline"
                  script: |
                    Get-ChildItem -Recurse "$(Pipeline.Workspace)/ExportedTemplates"
                    Get-Content "$(Pipeline.Workspace)/ExportedTemplates/LogicAppExportedTemplates/template.json" | Write-Host
                    Get-Content "$(Pipeline.Workspace)/ExportedTemplates/LogicAppExportedTemplates/parameters.json" | Write-Host
    
              - task: AzureResourceManagerTemplateDeployment@3
                displayName: "Deploy Logic App ARM Template"
                inputs:
                  deploymentScope: "Resource Group"
                  azureResourceManagerConnection: ""
                  subscriptionId: ""
                  resourceGroupName: ""
                  location: ""
                  templateLocation: "Linked artifact"
                  csmFile: "$(Pipeline.Workspace)/ExportedTemplates/LogicAppExportedTemplates/template.json"
                  csmParametersFile: "$(Pipeline.Workspace)/ExportedTemplates/LogicAppExportedTemplates/parameters.json"
                  deploymentMode: "Incremental"

    The provided YAML pipeline automates the deployment of a Logic App to Azure using a dynamic, reusable approach. Key features include:

    • Branch Trigger: The pipeline runs when changes are pushed to the main branch.

    • Variable Group Usage: A variable group (TestLogicApp-Variables) is linked to the pipeline to store dynamic values like LogicAppName, Location, SubscriptionId, etc. These are accessed using the $(varName) syntax for flexibility and easy updates without modifying the template.

    • ARM Template Preprocessing: The template.json is dynamically updated using PowerShell to inject values from the variable group, replacing placeholders such as the Logic App name, location, and connection settings.

    • Template Export and Debugging: The processed templates are saved, exported as build artifacts, and verified via debug steps to ensure correctness.

    • Deployment: The AzureResourceManagerTemplateDeployment@3 task deploys the Logic App ARM template to the specified Azure resource group using values from the variable group.

    This modular and dynamic approach ensures reusability across environments and simplifies Logic App deployment directly from your DevOps workflow.

    Step 6: Create and Run the Pipelines
    • Go to Pipelines > New Pipeline.

    • Select Azure Repos Git > choose your repo.

    • Select YAML and point to your YAML file.

    • Save and run the pipeline.

    Your Logic App will now be deployed automatically with environment-specific values.


    Final Thoughts
    Utilizing ARM templates alongside Azure DevOps pipelines for Logic App deployments provides a secure, scalable, and repeatable method that aligns with contemporary DevOps practices. By following the steps described above, you can eliminate manual deployment mistakes, standardize your deployment process, and fully version control your Logic App infrastructure.

    Furthermore, you can adjust various parameters and connection settings dynamically by utilizing PowerShell scripts in the pipeline, as shown in this blog. This method offers enhanced flexibility for deployments in diverse environments or for making real-time updates to configuration values.

    If you have any questions you can reach out to our SharePoint Consulting team here.

    Understanding Low, Mid, and High-Fidelity Layouts in UI/UX Design

    Introduction:

    In the world of UI/UX design, every great product begins with a blueprint - a visual guide that brings ideas to life before a single line of code is written. These blueprints, often referred to as design layouts or wireframes, come in varying levels of detail: low-fidelity, mid-fidelity, and high-fidelity. Each plays a distinct role in the design process, from capturing rough ideas on paper to creating polished, interactive prototypes ready for development. In this blog, we’ll break down the characteristics, purposes, and tools of each layout type to help you understand when and why to use them in your design workflow.


    1. Low-Fidelity Layout

    Meaning:

    • Very simple, rough sketches.
    • Focus on structure and flow, not visuals.
    • Often black-and-white, with boxes, lines, and placeholder text like "Image Here" or "Button".

    Purpose:

    • Quick exploration of ideas.
    • Getting early feedback.
    • No focus on branding, typography, or detailed design.

    Tools used:

    • Paper and pen.
    • Basic digital tools like Balsamiq, Figma (wireframe mode), or Whimsical.

    Looks like:

    • Hand-drawn or very basic shapes.
    • Labels instead of real images.
    • No colors, shadows, or textures.


    2. Mid-Fidelity Layout

    Meaning:

    • More detailed than low-fidelity.
    • Basic interactions and flow logic are shown.
    • Better typography and some design elements (like buttons, menus) but still no polished visuals.

    Purpose:

    • Validate usability.
    • Start testing navigation and layout.
    • Communicate more clearly with developers or stakeholders.

    Tools used:

    • Figma, Adobe XD, Sketch, or wireframe software with mid-level polish.

    Looks like:

    • Clear sections.
    • Greyscale or minimal color.
    • Basic icons.
    • Some real text instead of placeholders.


    3. High-Fidelity Layout

    Meaning:

    • Pixel-perfect design.
    • Full visuals, real images, colors, typography, and UI components.
    • Shows final interactions, animations, and responsive behavior.

    Purpose:

    • Final approval before development.
    • Handoff to developers.
    • Client presentations.

    Tools used:

    • Figma, Adobe XD, Sketch, InVision, Framer.

    Looks like:

    • Real branding.
    • Real images, polished fonts, shadows, gradients.
    • Complete interactivity (if prototyped).

    Conclusion:

    Designing a product isn’t a one-step process - it’s a journey from rough sketches to pixel-perfect interfaces. Low-fidelity layouts help you brainstorm and gather early feedback quickly, mid-fidelity layouts refine structure and usability, and high-fidelity layouts showcase the final vision ready for development. Understanding the differences between these fidelity levels enables clearer communication with stakeholders and a smoother design-to-development handoff. Whether you're just sketching out concepts or presenting to clients, choosing the right level of fidelity at the right stage is key to effective and efficient design.

    If you have any questions you can reach out to our SharePoint Consulting team here.