February 5, 2026

Modular Monolith vs Microservices: A Practical Architecture Guide

Software architecture trends often swing like a pendulum. Over the past decade, microservices dominated the conversation, promising scalability, autonomy, and faster delivery. Today, many teams are re-evaluating that choice and rediscovering a more balanced approach: the modular monolith.

This article explores why microservices became the default, where they struggled, and why modular monolithic architecture is emerging as a pragmatic alternative for many teams.

TL;DR: Modular Monolith vs Microservices

Microservices can enable independent scaling and deployments, but they also introduce distributed-system complexity that many teams don’t need. A modular monolith is often a better fit for small to medium teams because it keeps strong domain boundaries inside a single deployable application.

  • Microservices work best for large organizations with high operational maturity and autonomous teams.
  • Modular monolith architecture provides clear module boundaries without network calls between domains.
  • If your main challenge is delivery speed, maintainability, and cost, a modular monolith is often the pragmatic default.
  • You can still evolve: well-designed modules can be extracted into microservices later if scale demands it.

Who This Article Is For

This guide is designed for software teams and engineering leaders who are evaluating modern application architecture and want to avoid unnecessary complexity.

  • Startup and SaaS teams deciding between microservices and monolithic architectures
  • Small to medium engineering teams struggling with microservices operational overhead
  • Technical leads planning long-term scalable system design
  • Organizations considering a transition from microservices back to a modular monolith
  • Developers looking for a practical, maintainable architecture approach

Why Microservices Became the Default Choice

For nearly a decade, microservices were seen as the natural evolution of software architecture. As systems grew and teams expanded, breaking applications into smaller services promised independent deployments, better scalability, and faster delivery.

Microservices also became a signal of technical maturity. Organizations associated them with industry leaders and modern engineering practices, often adopting them early to avoid future rework.

For a time, this approach appeared inevitable. Splitting systems felt like progress.

Common Myths About Microservices

Microservices are often presented as a universal solution for modern software systems. In reality, many assumptions around them are misleading.

  • Myth: Microservices automatically improve scalability.
    In practice, many applications scale uniformly, meaning the entire system grows together. Microservices only provide real benefits when different components have significantly different scaling requirements.
  • Myth: Microservices speed up development for all teams.
    While they can help large autonomous teams, smaller teams often experience slower delivery due to increased coordination, infrastructure setup, and deployment pipelines.
  • Myth: Monoliths are inherently outdated.
    A well-structured modular monolith follows modern design principles and can be just as maintainable and scalable as distributed systems - without unnecessary complexity.
  • Myth: You must start with microservices to scale later.
    Starting with a modular monolith often provides a cleaner and safer path to microservices when real scaling needs emerge.

Why Microservices Struggle Outside Their Intended Context

Microservices were designed for a very specific environment: large systems, autonomous teams, and high operational maturity.

Most teams, however, do not operate under these conditions.

In practice, microservices shifted complexity away from code and into infrastructure. Simple changes began to require coordination across multiple services, pipelines, and teams. Deployments slowed, debugging became distributed, and observability turned into a requirement rather than a nice-to-have. Infrastructure costs increased even when scale did not.

Microservices didn't fail as an idea. They struggle when applied outside the context they were designed for, creating unnecessary complexity without delivering their intended benefits.

A Real-World Scenario: When Microservices Became a Bottleneck

Consider a growing SaaS product with a small engineering team of six developers. To follow modern best practices, the team adopted a microservices architecture early in the product lifecycle.

Within a year, the system consisted of more than ten services, each with its own deployment pipeline, monitoring setup, and infrastructure configuration. Simple feature changes required updates across multiple services and coordination between developers.

Instead of moving faster, the team spent increasing time managing CI/CD pipelines, debugging distributed failures, and maintaining cloud resources. Infrastructure costs rose, while feature delivery slowed.

After consolidating the system into a modular monolith with clearly defined internal modules, the team reduced operational overhead, simplified deployments, and improved development speed - while still maintaining strong domain separation.

This pattern is increasingly common across startups and small to medium engineering teams.

What is a Modular Monolith?

A modular monolith structures a single application as a collection of internally isolated modules, each aligned with a specific business or functional domain. Instead of allowing unrestricted access across the codebase, modules interact only through explicit contracts, ensuring that related functionality stays together and unrelated concerns remain separated. This intentional structure reduces accidental coupling and leads to a system that is more cohesive, predictable, and resilient to change.

Key Characteristics of a Modular Monolith

A modular monolith is defined not just by its structure, but by the architectural rules it enforces.

  • Single runtime and deployment unit
  • Internally isolated, domain-aligned modules
  • Explicit APIs between modules
  • No direct data access across module boundaries - modules access other modules' data only through explicit APIs, never directly querying each other's tables
  • No network calls between domains
  • Clear ownership and responsibility per module

These characteristics provide service-like boundaries while keeping execution in-process.
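As an illustration, here is a minimal Python sketch of two in-process modules communicating only through an explicit API. All module, class, and method names are invented for the example - this is a pattern sketch, not a prescribed framework:

```python
# A minimal sketch of explicit module contracts inside one process.
# Module, class, and method names here are illustrative, not from any framework.

from dataclasses import dataclass


# --- billing module: exposes a narrow public API ----------------------------
@dataclass
class Invoice:
    order_id: str
    amount_cents: int


class BillingApi:
    """The only entry point other modules may use to reach billing."""

    def __init__(self) -> None:
        self._invoices: dict[str, Invoice] = {}  # private state, never shared

    def create_invoice(self, order_id: str, amount_cents: int) -> Invoice:
        invoice = Invoice(order_id, amount_cents)
        self._invoices[order_id] = invoice
        return invoice


# --- orders module: depends on billing's API, not its internals -------------
class OrderService:
    def __init__(self, billing: BillingApi) -> None:
        self._billing = billing  # explicit contract, injected at composition time

    def place_order(self, order_id: str, amount_cents: int) -> Invoice:
        # In-process call across a module boundary - no network involved.
        return self._billing.create_invoice(order_id, amount_cents)


# Composition root wires modules together in a single deployable app.
billing = BillingApi()
orders = OrderService(billing)
print(orders.place_order("o-1", 1999).amount_cents)  # → 1999
```

The composition root is the one place that knows about every module; each module itself only sees the APIs it was handed.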

Architecture Comparison

  • Structure: Monolith - tightly coupled codebase; Modular monolith - single codebase with strict modules; Microservices - multiple independent services
  • Deployment: Monolith - single unit; Modular monolith - single unit; Microservices - multiple units
  • Communication: Monolith - in-process; Modular monolith - in-process via module APIs; Microservices - network calls
  • Data ownership: Monolith - shared database, no boundaries; Modular monolith - single database, logical boundaries per module; Microservices - database per service
  • CI/CD complexity: Monolith - low; Modular monolith - low; Microservices - high
  • Operational overhead: Monolith - low; Modular monolith - low to medium; Microservices - very high
  • Debugging: Monolith - simple early on; Modular monolith - predictable; Microservices - complex (distributed)
  • Scalability: Monolith - whole app (vertical/horizontal); Modular monolith - whole app (vertical/horizontal); Microservices - independent per service
  • Best fit: Monolith - very small teams; Modular monolith - small to medium teams; Microservices - large organizations

Benefits of Modular Monolithic Architecture

When implemented correctly, modular monolithic architecture offers practical advantages across development, operations, and cost.

  • Strong modular boundaries: Clear separation of domains prevents tight coupling while keeping related logic together.
  • ACID transactions across modules: Cross-module operations can use native database transactions, avoiding the complexity of distributed transaction patterns like sagas or two-phase commit required in microservices.
  • Simple deployment and operations: Single build, deploy, and rollback without distributed-system overhead.
  • Lower complexity, easier debugging: In-process communication enables predictable behavior and straightforward troubleshooting.
  • Faster development and CI/CD: Fewer pipelines and dependencies improve delivery speed and reliability.
  • Future-ready architecture: Well-defined modules can be extracted into microservices when scale or team size demands it.
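To make the transaction benefit concrete, here is a hedged sketch using SQLite. The tables and function are illustrative, but the pattern - one native transaction spanning two modules' tables - applies to any single-database modular monolith:

```python
# Sketch: two modules writing to one database inside a single ACID
# transaction - no saga or two-phase commit needed. Table and function
# names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id TEXT PRIMARY KEY)")
conn.execute("CREATE TABLE invoices (order_id TEXT PRIMARY KEY)")


def place_order_with_invoice(order_id: str) -> None:
    # One native transaction spans both modules' tables; if either
    # insert fails, both roll back together.
    with conn:
        conn.execute("INSERT INTO orders VALUES (?)", (order_id,))
        conn.execute("INSERT INTO invoices VALUES (?)", (order_id,))


place_order_with_invoice("o-1")
try:
    place_order_with_invoice("o-1")  # duplicate key: the transaction rolls back
except sqlite3.IntegrityError:
    pass

print(conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0])  # → 1
```

In a microservices split, the same guarantee would require a saga or two-phase commit across service boundaries.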

Challenges of Modular Monolithic Architecture

Despite its advantages, a modular monolith is not without trade-offs.

  • Requires strong discipline: Without strict enforcement of module boundaries, it can degrade into a tightly coupled system.
  • Limited independent scalability: The entire application scales as a unit, though horizontal scaling across multiple instances is fully supported.
  • Shared deployment risk: A bug in one module can impact the whole system.
  • Growing codebase over time: Can become harder to navigate without structure.
  • Boundary enforcement needs tooling: Requires architectural rules and automated checks.
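Boundary enforcement can start as simply as a check in CI. The sketch below uses an assumed module layout and rule set to flag imports that bypass a module's public API; in practice, dedicated tools such as import-linter for Python or ArchUnit for Java do this more robustly:

```python
# Minimal sketch of automated boundary checking: flag any source line in
# one module that imports another module's internals instead of its API.
# The rule set and module layout are assumptions for illustration.
import re

# orders may import only billing's public API package
ALLOWED = {"orders": {"billing.api"}}


def forbidden_imports(module: str, source: str) -> list[str]:
    hits = []
    for line in source.splitlines():
        m = re.match(r"\s*from\s+(billing\.[\w.]+)\s+import", line)
        if m and m.group(1) not in ALLOWED.get(module, set()):
            hits.append(m.group(1))
    return hits


code = "from billing.internal.db import InvoiceTable\n"
print(forbidden_imports("orders", code))  # → ['billing.internal.db']
```

Failing the build on any hit keeps boundary violations from accumulating silently.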

When to Use Modular Monolithic Architecture

Modular monolithic architecture is most effective in the following scenarios.

  • Small to medium-sized teams: Need simplicity with domain separation.
  • Early to mid-stage products: Flexibility matters more than fine-grained scalability.
  • Limited operational maturity: Avoid distributed overhead.
  • Uniform scaling needs: Most components scale together.
  • Systems expected to evolve: Clean path to microservices later.

Frequently Asked Questions (FAQ)

Is a modular monolith better than microservices?

A modular monolith is often better for small to medium-sized teams because it reduces operational complexity while maintaining strong architectural boundaries. Microservices are more suitable for large organizations with high scalability and operational maturity.

Can a modular monolith scale?

Yes. A modular monolith can scale effectively by scaling the entire application. In many real-world systems, most components scale together, making this approach sufficient and simpler than distributed microservices.

When should you choose microservices?

Microservices make sense when different parts of the system require independent scaling, when teams are large and autonomous, and when the organization can handle distributed system complexity.

Is starting with a monolith a bad practice?

No. Starting with a well-structured modular monolith is considered a best practice by many experienced architects. It provides simplicity early on and allows clean evolution into microservices when needed.

How is a modular monolith different from a traditional monolith?

A traditional monolith often lacks clear boundaries and becomes tightly coupled over time. A modular monolith enforces strict module separation, explicit APIs, and domain ownership while remaining a single deployable application.

Final Thoughts

Microservices are not obsolete - but they are no longer the default answer for every system.

For many teams, the real challenge is not scale. It is clarity, ownership, and operational simplicity.

Modular monolithic architecture offers a pragmatic middle ground: strong internal boundaries without unnecessary distributed system complexity.

If you have any questions or need help with CI/CD pipelines, DevOps automation, or cloud solutions, feel free to reach out to our team here.

Power Automate Silent Failures: Why Flows Succeed But Don’t Work (And How to Fix Them)

Introduction

Power Automate is a powerful automation tool. But sometimes, it behaves in a confusing way.

A flow may smile, report Succeeded, and quietly fail to do the thing you need.

No errors. No alerts. Just an automation-shaped hole where your business logic should be.

These are called silent failures, and they are some of the most dangerous problems in low-code automation.

In this blog, we will cover:

  • What silent failures are
  • Why they happen
  • How to detect and prevent them

What is a Silent Failure?

A silent failure happens when:

  • The flow run status is Succeeded
  • No error message is shown
  • But the expected outcome never happens

For example:

  • A condition evaluates incorrectly and skips critical steps
  • An action runs but affects zero records
  • An API call returns an empty response without error
  • A loop runs zero times without warning

Power Automate assumes you meant to do that. You did not.

Common Causes of Silent Failures

1. Conditions That Evaluate the “Wrong” Way

Conditions are the most common cause of silent failures.

Common mistakes:

  • Comparing a string with a number
  • Checking for null instead of an empty string
  • Assuming “Yes” equals true
  • Case sensitivity issues

Because of this, the flow goes into the wrong branch and skips important actions.

2. Empty Arrays and Zero-Iteration Loops

Actions like:

  • Get items
  • List rows
  • Get emails

can return zero records without any error.

If you use Apply to each, the loop simply does not run.

No errors. No warnings.
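The same hazard exists in any language, which makes it easy to reason about: a loop over an empty collection succeeds by doing nothing. A small illustrative sketch, with invented names:

```python
# The zero-iteration hazard in plain code: looping over an empty result
# "succeeds" while doing nothing. An explicit guard surfaces the empty case.
def handle(records: list[dict]) -> str:
    if len(records) == 0:
        return "terminated: no records"  # analogous to a Terminate action
    for record in records:
        pass  # process each record here
    return f"processed {len(records)} records"


print(handle([]))           # → terminated: no records
print(handle([{"id": 1}]))  # → processed 1 records
```

The guard turns "ran zero times" from an invisible non-event into a visible, intentional outcome.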

3. Actions That Succeed but Do Nothing

Some connectors report success even when nothing changes.

Example:

  • Updating an item with the wrong ID
  • Deleting a record that doesn’t exist
  • Sending an email whose “To” field resolves to empty at runtime

The action succeeds, but the result is missing.

4. Misconfigured Run After Settings

Run after is powerful but risky.

If you configure:

  • Run after has failed
  • Run after is skipped

But forget:

  • Run after has timed out

Then your error handling logic may never run.

5. Expressions That Return Null Silently

Expressions fail quietly when:

  • A property does not exist
  • JSON paths are wrong
  • Dynamic content is missing

Power Automate does not throw an error. It simply continues.

How to Catch Silent Failures

1. Validate Data Before Processing

Before doing anything important, verify assumptions.

Examples:

  • Check array length is greater than zero
  • Confirm required fields are not empty
  • Validate that IDs and keys exist
  • Use a condition like: length(body('Get_items')?['value']) > 0

If the condition fails, terminate the flow intentionally.

2. Use Terminate Actions Strategically

The Terminate action is very useful.

Use it to:

  • Stop the flow when preconditions are not met
  • Mark runs as Failed or Cancelled intentionally
  • Surface logic errors early

A failed flow is easier to identify than a silent one.

3. Log What You Expect, Not Just What You Get

Use:

  • Compose
  • Append to string variable

Log details such as:

  • Number of records retrieved
  • Which condition branch was executed
  • Important variable values

This makes troubleshooting easier.

4. Build a Dedicated Error Handling Scope

Wrap critical actions inside scopes:

  • Main logic
  • Error handler

Configure Error Handler to run after:

  • Has failed
  • Has timed out
  • Is skipped

Inside Error Handler:

  • Send an email or Teams notification
  • Log the run ID
  • Capture error details

5. Verify Output After Important Actions

After key actions, verify results:

  • After Update item, check returned ID
  • After Create record, confirm required fields exist
  • After Send email, verify Message ID is not null

If verification fails, terminate the flow.

6. Add “This Should Never Happen” Branches

In conditions:

  • Add an else branch for unexpected values
  • Treat unknown states as errors, not defaults

Silence helps bugs hide. Logging exposes them.
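The pattern is language-agnostic; sketched in Python, an unexpected value raises instead of falling through:

```python
# "This should never happen" branch: unknown states raise instead of
# silently taking a default path. Status names are illustrative.
def route(status: str) -> str:
    if status == "approved":
        return "notify-approver"
    if status == "rejected":
        return "notify-requester"
    # Unknown status is an error, not a default branch.
    raise ValueError(f"unexpected status: {status!r}")


print(route("approved"))  # → notify-approver
```

In a flow, the equivalent is an else branch that terminates the run with a clear failure message.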

Best Practice Tips

Think of Power Automate as a very literal assistant

It will:

  • Do exactly what you tell it
  • Assume success unless told otherwise
  • Avoid raising errors unless forced

Your job is to:

  • Question assumptions
  • Validate results
  • Make failures visible

Frequently Asked Questions

Why does my Power Automate flow show Succeeded but not work?

This usually happens due to silent failures such as incorrect conditions, empty data returned from actions, skipped loops, or expressions resolving to null without throwing errors.

What is a silent failure in Power Automate?

A silent failure occurs when a flow completes successfully but does not perform the intended action, and no error message is displayed.

How can I detect silent failures in Power Automate?

You can detect silent failures by validating inputs, logging expected outputs, using terminate actions, and implementing proper error handling scopes.

Do empty arrays cause Power Automate flows to skip actions?

Yes. If an action like Get items returns zero records, loops such as Apply to each will not run, and no warning will be shown.

Is error handling important in Power Automate?

Yes. Proper error handling ensures issues are surfaced instead of silently ignored, making flows easier to monitor and debug.

Final Thought

Silent failures are not Power Automate bugs.

They are missing conversations between you and your flow.

Make your flows:

  • Chatty
  • Opinionated
  • Loud when something goes wrong

Green checkmarks feel good, but the truth is better.

A noisy failure is better than a silent success.

How to Filter SharePoint Online News Posts in Communication Sites (Step-by-Step Guide)

Introduction

SharePoint Online's News feature has become an essential tool for internal communications, helping organizations share updates, announcements, and stories across their workforce. However, as your news content grows, a common challenge emerges: how do you keep everything organized and ensure employees see only the most relevant information?

Imagine your communication site displaying a mix of company-wide announcements, departmental updates, project reports, and blog posts all in one feed. While comprehensive, this approach can overwhelm users and make it difficult to find specific types of content. Department heads want to see only their team's updates, project managers need quick access to status reports, and employees looking for blog content don't want to scroll through unrelated announcements.

The solution is to filter your News web part by category. By organizing your news posts into distinct types such as:

  • Blogs
  • Departmental updates
  • Project news
  • Company announcements
  • Status reports

You can create targeted news sections that display exactly what your audience needs to see.

In this guide, we'll show you how to filter SharePoint News using the Page Category field—a fully Microsoft-supported method that leverages page properties. This approach requires no custom development, works across multiple sites, and gives you complete control over how news content is organized and displayed throughout your SharePoint environment.

Example of a filtered SharePoint News web part displaying categorized content

Why Use SharePoint Online to Filter News?

Filtering your News content gives you several benefits:

  • Show only relevant News posts.
  • Organize content by categories.
  • Create sections for your blog.
  • Maintain a clean communication site.
  • Improve information discoverability.

You can filter News on any page by using page metadata such as Page Category.

Requirements

Before you begin, ensure you have:

  • A SharePoint communication site
  • Site owner or administrator permissions
  • Permission to modify the Site Pages library
  • At least a few news posts already created (for testing the filter)

Step 1: Establish a Page Category Column in the Site Pages Library

  1. Navigate to your Communication Site.
  2. Select Site contents.
  3. Access Site Pages, hover over it to reveal the ellipses (…), then click them and select Settings.
Site Pages → ellipses (…) → Settings
  4. Click Create column and select Choice as the type.
  5. Label the column as "Page Category".
  6. Add choice values such as:
    • Blog
    • Project Report
    • Status Report
    • General News
    • General Page
    • Announcement
    • News
  7. If you want to set a default value, select one; otherwise, leave it blank.
  8. Click the "OK" button to create the column.
Enter the column name, select Choice as the type, add the choice values, set the default value, and click OK.

Important: This column must be created within the Site Pages library - not within any other list or library.

Step 2: Categorize Your News Posts using Page Category

Every News post needs to have a category assigned for filtering purposes.

  1. Open any News post.
  2. Select Page details (located at the top-right).
  3. Scroll down to Page Category.
  4. Pick a value (for instance, Blog).
  5. Save your changes (the page will auto-save) or republish if already published.
Go to Site Pages, select the page, click the top-right corner icon to open Page Details, and update the Page Category value.

Step 3: Add the SharePoint News Web Part to Your Page

Navigate to the page where you wish to show filtered News.

  1. Select Edit.
  2. Click on the plus sign.
  3. Search for News.
  4. Add the News web part.
  5. Choose your preferred format (tiles, list, carousel, etc.).

Note: If you are already on the page where you want the News web part, you can skip the navigation steps.

Step 4: Apply Filters to the News Web Part Based on Page Category

  1. Modify your page.
  2. Highlight the News web part.
  3. Select the Edit web part option (pencil symbol).
Click the Edit icon to update the news property.
  4. Scroll down to Filter.
  5. Pick Page properties.
See the filter and select the “Page Properties” filter option.
  6. In the Property name section, choose: Page Category.
For Property Name, select the property named “Page Category.”
  7. In the value input field, type or select: Blog.
Select the values; currently, only “Blog” is selected.
  8. Apply the changes.
  9. Publish the page.

Now, the News web part will exclusively show entries labeled as “Blog.”

Step 5: Create a Blog Section Utilizing News Filtering (optional)

You can establish a specific blog page by using this filtered News web part.

Example Configuration:

  • Page category = Blog

The News web part is now configured to display only blog entries.

Result: after publishing the page, only posts categorized as “Blog” appear; nothing else shows.

You can also give the page its own banner, design, and navigation links.

This provides a comprehensive blog experience within SharePoint Online.

Step 6: Scale Across Multiple Sites (Reusing the Page Category Column)

Once you've successfully set up filtered news on one communication site, you'll likely want to implement the same categorization system across other sites in your organization. The good news: you don't need to recreate the Page Category column from scratch each time.

SharePoint allows you to convert your custom column into a reusable Site Column that can be added to any communication site. This ensures consistency in how news is categorized across your entire SharePoint environment and saves significant setup time.

Benefits of Using Site Columns

  • Maintain consistent categorization across all sites
  • Save time by avoiding repetitive column creation
  • Ensure all sites use the same category values
  • Make updates to categories in one place

How to Create and Reuse the Page Category Site Column

Part A: Convert to Site Column (on your original site)

  1. Navigate to Site Settings on your source communication site.
  2. Under Web Designer Galleries, select Site columns.
  3. Click Create.
  4. Enter the column name: Page Category.
  5. Select type: Choice.
  6. Add your choice values (Blog, Announcement, Project News, etc.).
  7. Choose an appropriate group or create a new one (e.g., "Custom News Columns").
  8. Click OK to save.

Part B: Add Site Column to Other Sites

  1. Navigate to the target communication site where you want to use filtering.
  2. Go to Site contents.
  3. Open Site Pages library settings.
  4. Click Add from existing site columns.
  5. Locate and select Page Category from the appropriate group.
  6. Click Add, then OK.

The Page Category column is now available on the new site with all the same options you configured originally. You can immediately begin categorizing news posts and setting up filtered News web parts following Steps 2-4 from this guide.

Pro Tip: If you need to add new category values later, update the Site Column definition, and the changes will be reflected across all sites using that column.

Frequently Asked Questions (FAQs)

Can I filter SharePoint Online News by category?

Yes, you can filter SharePoint Online News by creating a Page Category column in the Site Pages library and applying it as a filter in the News web part.

What is the Page Category column in SharePoint?

The Page Category column is a custom metadata field added to the Site Pages library that helps categorize news posts such as blogs, announcements, and project updates.

Is filtering SharePoint News using Page Properties supported by Microsoft?

Yes, using Page Properties like Page Category for filtering the News web part is fully supported by Microsoft in SharePoint Online.

Can I reuse the Page Category column across multiple SharePoint sites?

Yes, by creating a Site Column, you can reuse the Page Category column across different communication sites for consistent filtering.

Can I create a blog section in SharePoint using the News web part?

Yes, by filtering the News web part with Page Category set to “Blog,” you can create a dedicated blog section within SharePoint Online.

Does filtering affect existing news posts?

Filtering does not modify existing posts, but they will only appear in the filtered web part once you assign them a Page Category value.

Conclusion

Filtering SharePoint Online News using the Page Category column is a powerful yet simple way to organize content and deliver targeted updates to users. By leveraging page properties and the News web part’s built-in filtering capabilities, organizations can create structured blog sections, department-specific news areas, and cleaner communication sites.

This approach is fully supported by Microsoft, scalable across multiple sites, and improves content discoverability without requiring custom development.

With proper categorization in place, SharePoint News becomes a more effective communication tool for your organization.

January 30, 2026

Generative AI in Quality Assurance: Automating Modern QA Workflows

Introduction

Quality Assurance has traditionally relied on manual testing, predefined scripts, and lengthy regression cycles. With growing software complexity and faster release timelines, these methods struggle to scale.

Generative Artificial Intelligence (GenAI) is transforming QA by automating test creation, improving defect detection, and optimizing test execution. Real-world AI-powered tools are already driving faster, smarter, and more reliable testing workflows.

What is Generative AI in Quality Assurance and How Does It Work?

Generative AI in QA uses advanced machine learning (AI/ML) models to generate content such as test cases, automation scripts, test data, and even defect analysis insights.

In QA workflows, GenAI enables:

  • Automatic test case generation
  • AI-driven automation creation
  • Intelligent defect prediction
  • Self-healing test scripts
  • Smart regression optimization

How Generative AI is Automating and Transforming Modern QA Workflows

  1. AI-Driven Test Case Generation Based on Requirements: GenAI analyzes user stories, acceptance criteria, and business flows to automatically generate comprehensive test cases. Tools like ACCELQ, Functionize, and Tricentis Tosca allow teams to convert requirements directly into executable tests, reducing manual effort and improving coverage.
  2. Intelligent Test Script Creation (Low-Code/No-Code): GenAI helps create automation scripts without heavy coding by understanding application behavior. GenAI-powered platforms such as Testim, Mabl and Functionize create low-code and self-healing automation scripts. These tools adapt automatically to UI changes, reducing maintenance while increasing automation stability.
  3. AI-Powered Defect Detection and Root Cause Analysis: AI analyzes logs, failures, and historical defects to predict high-risk areas and find root causes faster. Tools like Functionize and Mabl use AI analytics to detect anomalies, predict failures, and identify root causes. This enables faster issue resolution and proactive quality improvements.
  4. AI-Driven Self-Healing Test Automation: AI updates test scripts automatically when UI elements change, eliminating broken tests. Tools such as Testim automatically adapt to UI changes, Mabl provides self-healing locators with smart waits, and Tricentis Tosca leverages AI-based test object recognition to ensure stable and resilient test automation.
  5. AI-Based Risk-Driven Test Prioritization: GenAI predicts which test cases are most likely to fail based on recent changes and past trends. Platforms like Mabl enable risk-based test execution, Tricentis Tosca applies AI-driven regression optimization, and ACCELQ provides smart execution planning to accelerate and prioritize critical test scenarios.
  6. AI-Powered Test Data Generation: AI creates realistic and compliant synthetic test data. Tools such as Tricentis Data Integrity leverage AI-driven data generation and masking, while GenRocket uses AI-assisted synthetic data creation to produce realistic, compliant test datasets for comprehensive testing.
  7. Conversational AI Assistants: AI chat interfaces assist testers in debugging, reporting, and test analysis. AI-powered assistants help QA engineers understand failures, generate reports, and receive insights through natural language. Solutions like Functionize AI Chat explain test failures and recommend fixes, while AI-powered DevOps bots integrated with Slack and Jira provide real-time insights and automation support across QA workflows.
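Vendors keep their models proprietary, but the core idea behind risk-driven prioritization (point 5 above) can be sketched in a few lines. The scoring weights and test data below are illustrative assumptions, not any vendor's algorithm:

```python
# Illustrative sketch of risk-driven test prioritization: score each test
# by recent failure rate plus a bonus when it touches recently changed code.
def risk_score(failure_rate: float, touches_changed_code: bool) -> float:
    return failure_rate + (0.5 if touches_changed_code else 0.0)


tests = [
    {"name": "test_checkout", "failure_rate": 0.30, "changed": True},
    {"name": "test_login", "failure_rate": 0.05, "changed": False},
    {"name": "test_search", "failure_rate": 0.10, "changed": True},
]

# Run the riskiest tests first so likely failures surface early.
ranked = sorted(
    tests,
    key=lambda t: risk_score(t["failure_rate"], t["changed"]),
    reverse=True,
)
print([t["name"] for t in ranked])  # → ['test_checkout', 'test_search', 'test_login']
```

Commercial platforms replace the hand-tuned weights with models trained on historical run data, but the output is the same: an execution order that front-loads risk.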

Business Impact of Generative AI in Quality Assurance

AI-driven QA workflows reduce manual testing effort, stabilize automation, accelerate releases, lower costs, and significantly improve product quality and customer satisfaction.

Challenges and Considerations

Successfully adopting Generative AI in QA requires reliable training data, strong security controls, and human oversight to validate AI outputs. Organizations must also ensure regulatory compliance and carefully integrate AI solutions into their existing testing processes.

Conclusion

Generative AI is revolutionizing QA through real-world platforms like Testim, Mabl, Functionize, Tricentis Tosca, and ACCELQ. By automating testing and introducing intelligence into workflows, organizations can achieve faster delivery and higher quality software.

Frequently Asked Questions

FAQ 1: What is Generative AI in Quality Assurance?

Generative AI in Quality Assurance refers to AI models that automatically create test cases, automation scripts, test data, and defect insights by analyzing requirements, application behavior, and historical testing data.

FAQ 2: How does Generative AI improve software testing?

Generative AI improves software testing by automating test design, enabling self-healing automation, predicting defects, optimizing regression testing, and reducing manual effort across QA workflows.

FAQ 3: Which tools use Generative AI for QA testing?

Popular AI-driven QA tools include Testim, Mabl, Functionize, Tricentis Tosca, ACCELQ, Tricentis Data Integrity, and GenRocket, all of which leverage AI for automation, analytics, and test optimization.

FAQ 4: Can Generative AI replace manual testers?

No, Generative AI enhances QA workflows but does not replace testers. Human expertise is essential for test strategy, validation, business logic understanding, and governance.

FAQ 5: Is AI-driven testing suitable for enterprise applications?

Yes, AI-driven testing is widely adopted in enterprise environments to handle complex systems, large regression suites, and continuous delivery pipelines.

FAQ 6: What is the future of AI-driven testing?

The future of QA includes autonomous testing pipelines, predictive quality analytics, self-healing automation, and AI-powered continuous testing integrated into DevOps processes.

If you have questions about implementing Generative AI in your QA workflows, connect with our AI Consulting team here.

January 29, 2026

Power Platform ALM Using Native Pipelines: Step-by-Step Dev to Prod Deployment Guide

Power Platform ALM with Pipelines: Step-by-Step Dev to Prod Deployment Guide

Introduction

Application Lifecycle Management (ALM) is critical for building reliable, scalable Power Platform solutions. A proper ALM setup ensures that changes are developed safely, tested thoroughly, and deployed consistently into production.

Microsoft Power Platform Pipelines provide a native CI/CD automation approach to deploy Power Platform solutions across environments while maintaining governance, traceability, and consistency.

This article covers a complete Power Platform ALM implementation using native Power Platform Pipelines.

Below, we'll configure Power Platform Pipelines for a standard Dev → Test → Prod setup and walk through deploying a solution across environments.

Prerequisites

Before starting, make sure you already have:

  1. Three Power Platform environments configured:
    • Development (Sandbox, unmanaged solutions)
    • Test (Sandbox, managed solutions)
    • Production (managed solutions)
  2. Dataverse enabled in all environments
  3. Power Platform admin access
  4. A sample or real solution in the Dev environment

Before You Begin

This guide assumes that at least one solution already exists in your Development environment for deployment validation.

If not, create a new solution and add one or more Power Platform components such as:

  • A Canvas or Model-driven Power App
  • A Power Automate flow
  • A Copilot agent
  • A Dataverse table

This solution will be used to validate your Dev → Test → Prod deployments using pipelines.

We’ll refer to this as the example solution throughout the guide.

Setting Up the Power Platform Pipelines Host Environment

Power Platform Pipelines require a dedicated host environment where pipeline configurations, deployment stages, and execution are stored and managed.

This is typically a Production-type environment with Dataverse enabled, dedicated to managing pipeline configurations and execution.

Step 1: Create the Host Environment

  1. Go to the Power Platform Admin Center: https://admin.powerplatform.com
  2. Navigate to Manage → Environments and click “New”

Use these settings:

  • Name: Power Platform Pipelines Host
  • Managed: No
  • Type: Production
  • Add Dataverse: Yes
  • URL: companyname-pp-host

Wait for provisioning to complete; once the environment is in the Ready state, continue with Step 2.

Step 2: Install Power Platform Pipelines App

  1. In Admin Center, go to Manage → Dynamics 365 Apps
  2. Find Power Platform Pipelines
  3. Click Install
  4. Select the Host Environment
  5. Install

After installation, you’ll see a model-driven app named “Deployment Pipeline Configuration” in the Power Platform Pipelines Host environment. This is where all pipelines are managed.

Step 3: Grant Permissions to the Existing Service Account

Although Power Platform Pipelines can run under a personal user account, using a dedicated service account is a recommended best practice: it ensures continuity, improves security, and avoids granting elevated privileges (such as the System Administrator role) to individual users in target environments.

In this guide, we assume your organization already has a dedicated service account for automation and integrations.

Required Permissions

The service account must have System Administrator access in all environments involved in the pipeline:

  • Development
  • Test
  • Production
  • Pipelines Host environment

How to Assign Roles

In each environment:

  1. Open Power Platform Admin Center
  2. Select the environment and go to “Users -> See all”
  3. Select the service account from the list of users
  4. Assign the System Administrator security role

Repeat this for all environments: Dev, Test, Prod, and Host.

Step 4: Register Environments in the Pipelines App

Open the Deployment Pipeline Configuration app in the host environment.

Register Development Environment

  1. Go to Environments → New
  2. Fill in:
    • Name: ALM (Dev)
    • Type: Development
    • Owner: You
    • Environment ID: Copy from the Development environment landing page
  3. Save and wait for the validation status to show Success

Register Target Environments

Repeat the same process for:

Test
  • Name: ALM (Test)
  • Type: Target
Production
  • Name: ALM (Prod)
  • Type: Target

Step 5: Create a Pipeline

Open the Deployment Pipeline Configuration app in the host environment.

  1. Go to Pipelines and click New
  2. Name: ALM Pipeline
  3. Enable: Allow redeployments of older versions
  4. Save

Link the Development Environment

Add Development Environment as the source environment to the created pipeline.

Add Deployment Stages

Click New Deployment Stage:

Test Stage
  • Name: Deployment to Test
  • Target: Test Environment
Production Stage
  • Name: Deployment to Prod
  • Previous Stage: Test
  • Target: Production Environment
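The chaining above (the Production stage names Test as its previous stage) is what enforces deployment order. As an illustration only, not the Pipelines API, the ordering rule can be sketched like this, using the stage names from this guide:

```python
# Illustrative model of chained deployment stages (not the Pipelines API).
# Each stage may name a previous stage that must complete first.
stages = {
    "Deployment to Test": {"target": "ALM (Test)", "previous": None},
    "Deployment to Prod": {"target": "ALM (Prod)", "previous": "Deployment to Test"},
}

def deployment_order(stages):
    """Return stage names ordered so each stage follows its prerequisite."""
    ordered, remaining = [], dict(stages)
    while remaining:
        for name, cfg in list(remaining.items()):
            if cfg["previous"] is None or cfg["previous"] in ordered:
                ordered.append(name)
                del remaining[name]
    return ordered

print(deployment_order(stages))  # Test stage first, then Prod
```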

Both stages now appear in the Deployment Stages section of the pipeline.

Assign Security Roles

Open Security Teams in the Pipelines app.

Pipeline Admins

Add users who are allowed to configure pipelines. This will allow added users to access the deployment pipeline configuration app, add new pipelines, and edit existing pipelines in the host environment.

  • Navigate to Deployment Pipeline Administrators
  • Click Add existing user
  • Search for the required user and add them

Pipeline Users

Add users who are allowed to run deployments.

  • Navigate to Deployment Pipeline Users
  • Click Add existing user
  • Search for the required user and add them

Step 6: Deploy Power Platform Solution to Test Environment Using Pipelines

As we have created the Power Platform pipeline, we can deploy the solution from the Development environment to the Test (Staging) environment using the pipeline. Once it is successfully validated in the Test (Staging) environment, the solution can then be deployed to the Production environment.

  • Go to Development Environment
  • Open your example solution

  • Click Pipelines

  • Select your pipeline, and click Deploy here (Test/Staging stage)

  • Select Now (or you can select Later to schedule the deployment) and click Next

  • Verify the connections and resolve errors if any

  • Verify the environment variable values and update them as needed

  • Verify the Deployment Notes, modify as needed and click Deploy

  • Wait a few minutes for the deployment to complete; the pipeline run shows a completed status when it finishes

Verify solution appears as Managed in Test (Staging) Environment

  • Go to Test (Staging) Environment and the deployed solution should appear here

  • Perform functional validation of the solution in the Test (Staging) environment.

Step 7: Deploy Power Platform Solution to Production Environment

Once testing is completed in the Test (Staging) environment, we can deploy the same solution to the Production environment using the same pipeline.

  • Go to Development Environment, open your example solution, go to pipelines
  • Select your pipeline, and click Deploy here (Production stage)

  • Then, follow the same steps we followed to deploy to Test (Staging) Environment

Verify solution appears as Managed in Production Environment

  • Go to Production Environment and the deployed solution should appear here

  • Perform final validation of the solution in the Production environment.

Conclusion

Implementing Power Platform ALM using native Pipelines simplifies deployment automation, improves governance, and ensures consistent solution delivery across environments. By following a structured Dev → Test → Prod approach, organizations can reduce deployment risks while accelerating release cycles.

Best Practices for Power Platform ALM Using Pipelines

  • Keep Development solutions unmanaged for flexibility
  • Always deploy managed solutions to Test and Production
  • Use service accounts for pipeline execution
  • Maintain environment variables per environment
  • Validate deployments in staging before production release

If you need assistance implementing Power Platform ALM or automating enterprise deployments, feel free to contact our SharePoint & Power Platform consulting team here.

Introducing Heft: The Modern Build Tool Replacing Gulp in SharePoint Framework (SPFx)

Introducing Heft: Modern Build Tool Replacing Gulp in SPFx Development

For a long time, Gulp was the default build tool for SharePoint Framework (SPFx) projects. Developers relied on familiar commands like gulp serve and gulp bundle to compile, package, and deploy their SPFx solutions.

However, as SPFx applications grew in size and complexity, the traditional Gulp-based build system began to struggle with performance, scalability, and long-term maintainability.

To address these challenges, Microsoft introduced Heft - a modern build orchestrator from the Rush Stack ecosystem - and made it the default SPFx build tool starting with SPFx v1.22.

In this article, we’ll explore the differences between SPFx Heft vs Gulp, why Microsoft made the switch, and how Heft improves the modern SharePoint Framework development workflow.

The Gulp Era in SharePoint Framework (SPFx)

In the early days, Gulp handled almost everything in an SPFx project:

  • Compiling TypeScript
  • Bundling with Webpack
  • Running the local dev server
  • Packaging .sppkg files
  • Automating the build pipeline

Typical workflows looked like this:

gulp serve
gulp bundle --ship
gulp package-solution --ship

For small projects, this worked fine. For large, long-living enterprise solutions, it did not.

Why Gulp Started to Fail

1. Slow Builds at Scale: Gulp runs tasks mostly sequentially, lacks smart caching, and often triggers full rebuilds for small changes. Result: Slow feedback loops and reduced productivity.

2. Fragile gulpfile.js: Task chains become complex, hard to debug, and frequently break during SPFx upgrades. Result: Build scripts harder to maintain than the app.

3. Poor Fit for Monorepos & Enterprise: Gulp wasn’t designed for monorepos, sharing build logic was painful, and dependency conflicts were common. Result: Scaling SPFx across teams became difficult.

4. Weak Type Safety & Debugging: Mostly JavaScript-based with unclear errors and poor traceability across tools. Result: Developers spent more time debugging the toolchain than writing features.

Enter Heft: The Modern SPFx Build Tool

Heft is a modern build orchestrator from Microsoft’s Rush Stack team, built to support large, enterprise-scale TypeScript solutions.

Unlike Gulp, which is a general-purpose task runner, Heft understands how modern development tools relate to one another - including TypeScript, ESLint, Jest, and Webpack.

Heft focuses on:

  • Clearly defined build phases
  • Plugin-based architecture
  • Incremental builds and smart caching
  • Parallel execution where possible

SPFx internally uses Heft to handle:

  • Compilation
  • Bundling
  • Linting
  • Testing
  • Packaging

SPFx Workflow Update: With SPFx v1.22, Gulp is replaced by Heft - but the developer experience remains familiar.

  • Dev server: heft start
  • Production build: heft build --production
  • Package: heft package-solution --production

These commands are mapped to standard npm scripts (npm start, npm run build), so day-to-day development workflows remain unchanged.

SPFx Heft vs Gulp: What Actually Changed?

  • Build approach: Gulp uses scripted tasks; Heft uses phase-based orchestration
  • Performance: Gulp slows down at scale; Heft is faster thanks to caching and parallelism
  • Configuration: Gulp relies on gulpfile.js; Heft uses JSON-based configs
  • Type safety: limited in Gulp; strong in Heft
  • Monorepo support: weak in Gulp; built into Heft
  • Debugging: Gulp errors are hard to trace; Heft produces clear errors and logs

Deployment: What Did NOT Change

The deployment process remains exactly the same:

  • Output is still a .sppkg file
  • Deployment still happens via:
    • SharePoint App Catalog
    • CI/CD pipelines (Azure DevOps, GitHub Actions)

Only the build engine changed - not the deployment process.

Node.js & SPFx Compatibility

  • SPFx v1.21.1+ → Node.js 22 LTS
  • Older SPFx → Node.js 16 / 18
  • SPFx ≤ 1.21 uses the Gulp-based toolchain
  • Heft becomes the default starting from SPFx 1.22

Heft officially replaces Gulp starting with SPFx 1.22.

Why Heft Actually Matters

Moving to Heft brings real, practical benefits:

  • Faster rebuilds
  • Less configuration code
  • Fewer breaking changes
  • Consistent builds across teams

Less time fighting the build system, more time writing features.

Frequently Asked Questions (FAQs)

What is Heft in SharePoint Framework (SPFx)?

Heft is a modern build orchestrator developed by Microsoft’s Rush Stack team. It replaces the traditional Gulp-based build system in SharePoint Framework (SPFx) starting from version 1.22, providing faster builds, better scalability, and improved developer experience.

Why did Microsoft replace Gulp with Heft in SPFx?

Microsoft replaced Gulp with Heft to improve performance, maintainability, and scalability of SPFx projects. While Gulp worked well for smaller solutions, it struggled with large enterprise applications. Heft introduces incremental builds, parallel execution, and modern tooling integration.

Is Gulp still used in SPFx projects?

Yes, older SPFx versions (up to 1.21) still use the Gulp-based build system. Starting from SPFx version 1.22, Heft is the default build tool for all new and updated projects.

Does Heft change the SPFx deployment process?

No. The deployment process remains unchanged. Developers still generate .sppkg files and deploy them through the SharePoint App Catalog or automated CI/CD pipelines such as Azure DevOps and GitHub Actions.

Which Node.js version should be used with Heft in SPFx?

SPFx version 1.21.1 and later support Node.js 22 LTS, while older SPFx versions typically rely on Node.js 16 or 18 depending on compatibility.

Final Thoughts

Gulp served SPFx well in its early days, but modern enterprise needs demanded something better.

Heft is not just a replacement; it’s an upgrade.

The shift from Gulp to Heft reflects Microsoft’s move toward a faster, more scalable build system for SharePoint Framework projects.

If you have any questions, reach out to our SharePoint Consulting team here.

January 23, 2026

A Complete Step-by-Step Guide to Google Workspace to Microsoft 365 Email Migration


Introduction

This blog describes an email migration from Google Workspace to Microsoft 365, focused exclusively on transitioning Gmail mailboxes to Exchange Online. The migration involved precise user-to-user mailbox mapping, preservation of primary SMTP addresses, and configuration of email aliases to ensure uninterrupted mail flow across multiple domains.

A phased, domain-based migration approach was used to reduce risk, validate mail delivery, and maintain identity consistency during cutover, providing a practical reference for executing real-world Google Workspace to Microsoft 365 email migrations.


Note: This article focuses on email migration.

To migrate files and folders, refer to this guide for Google Drive to OneDrive for Business (ODFB) migration.

Adding Domain in Microsoft 365 Account

Step 1: Access the Microsoft 365 Portal

Begin by navigating to office.com in your web browser. Sign in using your Global Administrator credentials to access the main dashboard.

Step 2: Launch the Admin Center

From your Microsoft 365 Home dashboard, locate the Admin tile (usually found under "Quick access" or by clicking the app launcher) and click it to enter the management portal.


Step 3: Access Domain Management

Inside the Admin Center, locate the left-hand navigation menu. Click on Settings to expand the list, then select Domains. This is where you will configure and verify the external domain you wish to migrate from Google Workspace.


Step 4: Initiate the Add Domain Wizard

In the Domains management pane, look for the toolbar at the top of the list. Click on the + Add domain button to begin the wizard. This will launch the setup process for linking your existing Google Workspace domain (e.g., yourcompany.com) to your Microsoft 365 tenant.


Step 5: Input Your Domain Name

The Add a domain wizard will now appear. In the designated text field, type the exact name of the domain you are migrating from Google Workspace (e.g., yourcompany.com). Once entered, click the Use this domain button to proceed to the verification stage.


Step 6: Verify Domain Ownership

Microsoft must confirm that you own the domain before proceeding. You will be presented with a few verification options. The most common and secure method is to select Add a TXT record to the domain's DNS records. Select this option and click Verify (or continue) to retrieve the specific record values you need to add to your DNS host.


Step 7: Defer the DNS Connection

The wizard will now ask how you want to connect your domain. Since this is a migration and you are not ready to switch mail flow (MX records) yet, it is crucial to select Skip and do this later. This prevents any immediate disruption to your current Google Workspace emails. Click Continue to finalize the setup without altering your DNS.


Step 8: Finalize the Domain Setup

You will now see a confirmation screen stating, "Domain setup is complete." This indicates that Microsoft 365 has successfully verified your domain ownership. Since we chose to skip the DNS record updates for now (to keep your current email active), simply click the Done button to close the wizard.

Note: Once you click the Done button, your new domain will appear in the active domains list. It may also be automatically set as your Default domain for new user creation.


Provisioning Users for Microsoft 365 Migration

Step 9: Access User Management

In the Microsoft 365 Admin Center, locate the left-hand navigation pane. Click on Users to expand the menu, then select Active users. This dashboard is where you will create the destination accounts for your migration.


Step 10: Initiate User Creation

On the Active users page, you will see the management dashboard for your user accounts. To start provisioning a new user, click the Add a user button located in the toolbar at the top of the list.


Step 11: Enter Basic User Details

The Add a user wizard will open to the Basics tab. Here, you must fill in the user's identity information, including First name, Last name, and Display name.


Step 12: Assign Product Licenses

On the Product licenses page, first select the appropriate country from the Location dropdown menu. Next, ensure the Assign user a product license option is selected. Check the box next to the specific license you wish to assign (e.g., Microsoft 365 Business Standard) and click Next.


Step 13: Configure Optional Settings

You will arrive at the Optional settings page. Here, you can assign specific administrative Roles to the user if needed, such as "Global admin" or "User admin". For a standard user, you can leave the default role ("User: no administration access") and click Next to proceed.


Step 14: Review and Finalize

You will now reach the Review and finish page. This is your chance to verify all the details you have configured, including the display name, username, and assigned licenses. If everything looks correct, click the Finish adding button to officially create the user account.


Step 15: Confirmation of User Creation

After a few seconds of processing, a confirmation window will appear stating "User added to active users." This indicates the account has been successfully created. You can view or copy the password details here if needed. Once done, click the Close button to exit the wizard.

Note: You must repeat these steps for every single active user you intend to migrate. Ensure all accounts are created and assigned valid licenses before proceeding to the migration phase.

Once all users are added, remember to set your custom domain as the Default. To do this, navigate to Settings > Domains in the Admin Center, select your new domain, and choose Set as default.


Migrating Google Workspace Mailboxes to Microsoft 365

Step 16: Access the Exchange Admin Center

In the Microsoft 365 Admin Center, locate the left-hand navigation menu. Click on ... Show all to expand the full list of options. Scroll down to the "Admin centers" section and select Exchange. This will launch the modern Exchange Admin Center (EAC) where the migration will be managed.


Step 17: Initiate a New Migration Batch

In the Exchange Admin Center dashboard, locate the Migration option in the left-hand navigation pane. Click it to open the migration management screen. From the top menu bar, select Add migration batch to launch the configuration wizard.


Step 18: Define Batch Name and Path

The Add migration batch wizard will open. First, enter a unique name for your migration (e.g., "GWorkspace_Migration") in the Migration batch name field. Next, locate the Select the mailbox migration path dropdown menu and choose Migration to Exchange Online. Click Next to proceed.


Step 19: Select Migration Type

On the Migration type screen, you will be presented with several options. Select Google Workspace (Gmail) migration from the list to specify the source environment. Click Next to continue to the prerequisites check.


Step 20: Configure Migration Prerequisites

The Prerequisites for Google Workspace migration window will now appear. You must choose how to prepare your Google Workspace environment for the data transfer. You have two options:
  • Automate the configuration: Let Microsoft 365 handle the setup automatically (Recommended).
  • Manually configure: Upload your own JSON key file and configure settings manually (Advanced).

Automate the Configuration of Your Google Workspace for Migration

Step 21: Initiate Automated Configuration

Under the Automate the configuration of your Google Workspace for migration section, click on the Start button. This will trigger the automated setup process which handles the necessary permissions and API connections for you.

Important Requirement: Before proceeding, ensure you have the Super Admin credentials for the Google Workspace tenant you are migrating from. The automated configuration tool requires these elevated privileges to establish the necessary connections and permissions.


Step 22: Retrieve Client ID, Scopes, and Private Key

Once the automation process finishes, the wizard will display your client ID and the required API Scopes. A direct link will also be provided to the Google Workspace Admin console, where you must add these scopes to authorize the connection.

Important: A Private Key (JSON file) will automatically download to your computer. Save this file securely, as it serves as the credential key for the migration endpoint.


Step 23: Configure Domain-Wide Delegation

In the Microsoft 365 prerequisites window, locate and click the blue Link next to the text "Click the link to add scopes for API access".

A new browser window will open, taking you to the Google Workspace Admin console. On the "Domain-wide Delegation" page, click the Add new button. You will then use the Client ID and API Scopes that were generated in the previous step to grant the necessary permissions.


Step 24: Authorize the API Connection

In the Add a new client ID pop-up window that appears in the Google Admin Console:
  • Paste the Client ID (copied from the Microsoft 365 wizard) into the Client ID field.
  • Paste the OAuth scopes (also from the wizard) into the OAuth scopes field.
  • Click the Authorize button to save the configuration and grant the necessary permissions.


Manually Configure Google Workspace for Migration

If the automated tool is not an option, or if you prefer full control over the security settings, you can manually configure the necessary Google Cloud components.

Step 25: Create a Google Cloud Project

  • Open the Google Cloud Console (https://console.cloud.google.com) and click the project dropdown menu in the top navigation bar (often labeled Select a project).
  • In the pop-up window, click New Project.
  • Enter a project name (e.g., "M365-Migration-Project") and click Create.


Step 26: Configure and Create the Project

In the New Project screen that appears:
  1. Enter a descriptive name in the Project name field (e.g., "M365-Migration-Project").
  2. Under Location, browse and select your organization (if applicable) or leave it as "No organization".
  3. Click the Create button to initialize the new project.



Step 27: Select the Created Project

Once the project is successfully created, a notification will appear in the top-right corner of the Google Cloud Console. Click on SELECT PROJECT within this notification to switch your active dashboard to the new project.


Step 28: Create a Service Account

  1. Navigate to the IAM & Admin section in the Google Cloud Console (or visit https://console.cloud.google.com/iam-admin).
  2. From the left-hand menu, click on Service accounts.
  3. At the top of the main pane, click the + CREATE SERVICE ACCOUNT button.

Step 29: Define Service Account Details

In the Create service account form:
  1. Enter a descriptive name in the Service account name field (e.g., m365-migration-svc).
  2. The Service account ID field will populate automatically based on your entry.
  3. Click the CREATE AND CONTINUE button to save and proceed to the next step.

Step 30: Grant Access to the Service Account

  1. In the "Grant this service account access to project" section, click the Select a role dropdown menu.
  2. Choose Owner (under "Basic" or by searching for "Owner") to give the service account full access to the project.
  3. Click CONTINUE to apply the role.
  4. Finally, click DONE at the bottom of the page to complete the service account creation.

Step 31: Retrieve and Save the Client ID

After creating the service account, you will be redirected to the Service accounts list.
  1. Locate your newly created service account in the table.
  2. Find the OAuth 2 Client ID (a long string of numbers) in the corresponding column.
  3. Copy this Client ID and save it in a secure location (like a Notepad file). You will need this ID later to configure domain-wide delegation in the Google Admin console.

Step 32: Generate the Private Key (JSON)

  1. In the Service Accounts list, click on the email address (link) of the service account you just created.
  2. On the service account details page, click the KEYS tab in the top navigation bar.
  3. Click the ADD KEY dropdown button and select Create new key.
  4. In the pop-up window, ensure JSON is selected as the "Key type".
  5. Click CREATE. The JSON file containing your private key will automatically download to your computer.

Step 34: Save the Downloaded Key

After clicking Create, a confirmation window will appear stating that the Private key saved to your computer. The JSON file will automatically download to your browser's default download location.

Crucial: This is the only time you can view or download this specific private key. If you lose it, you will need to generate a new one.
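Before uploading the key later in the Exchange wizard, it can save a round trip to confirm the file really is a service-account key. The sketch below checks the standard fields of a Google service-account key file; the sample values are synthetic, not a real key:

```python
import json

REQUIRED_FIELDS = {"type", "client_email", "private_key", "client_id"}

def validate_service_account_key(raw: str) -> list:
    """Return a list of problems found in a service-account key JSON string."""
    try:
        key = json.loads(raw)
    except json.JSONDecodeError:
        return ["file is not valid JSON"]
    problems = []
    missing = REQUIRED_FIELDS - key.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    if key.get("type") != "service_account":
        problems.append("'type' should be 'service_account'")
    return problems

# Synthetic example body (never paste a real private key into scripts):
sample = json.dumps({
    "type": "service_account",
    "client_email": "m365-migration-svc@example.iam.gserviceaccount.com",
    "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
    "client_id": "123456789012345678901",
})
print(validate_service_account_key(sample))  # [] means the file looks usable
```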


Step 35: Enable Required Google Workspace APIs

To allow the migration tool to access and move your data, you must manually enable the required APIs in your project.
  1. Navigate to the API Library (APIs & Services → Library) in the Google Cloud Console.
  2. Search for and enable each of the following APIs:
    • Gmail API
    • Google Calendar API
    • Google People API
    • Google Contacts API

Note: If an API is already enabled, the button will say "Manage" instead of "Enable". You can simply move to the next one.


Step 36: Add Domain-Wide Delegation

  1. Open a new tab and log in to the Google Admin Console (admin.google.com).
  2. Navigate to the Domain-wide Delegation page. You can use this direct link: https://admin.google.com/ac/owl/domainwidedelegation (alternatively, go to Security > Access and data control > API controls > Manage Domain Wide Delegation).
  3. Click the Add new button to register your service account.



Step 37: Enter Client ID and Authorize Scopes

In the Add a new client ID popup window that appears:
  1. Client ID: Paste the long numeric string you saved earlier (from Step 31).
  2. OAuth scopes: Copy and paste the exact list of scopes below into this field: https://mail.google.com,https://www.googleapis.com/auth/contacts,https://www.googleapis.com/auth/calendar,https://www.googleapis.com/auth/gmail.settings.sharing,https://www.google.com/m8/feeds
  3. Click the Authorize button.
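A frequent failure here is invisible whitespace or a line break introduced while copying the scope list. A quick sanity check on the comma-separated string before pasting it (the constant mirrors the scope list above):

```python
# The scope list from this guide, as one comma-separated string (no spaces).
SCOPES = (
    "https://mail.google.com,"
    "https://www.googleapis.com/auth/contacts,"
    "https://www.googleapis.com/auth/calendar,"
    "https://www.googleapis.com/auth/gmail.settings.sharing,"
    "https://www.google.com/m8/feeds"
)

def check_scopes(raw: str) -> list:
    """Flag common copy/paste issues in a comma-separated OAuth scope string."""
    issues = []
    for scope in raw.split(","):
        if scope != scope.strip():
            issues.append(f"whitespace around: {scope!r}")
        if not scope.strip().startswith("https://"):
            issues.append(f"not an https URL: {scope!r}")
    return issues

print(len(SCOPES.split(",")))  # 5 scopes expected
print(check_scopes(SCOPES))    # [] means the string is clean
```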

You have now successfully completed all the required configurations in the Google Cloud Console and Google Workspace Admin Console (Service Account, APIs, and Domain-Wide Delegation).
  • Leave the Google tabs open (just in case you need to copy values again).
  • Switch your browser tab back to the Microsoft 365 Exchange Admin Center.
  • You should still be on the Add migration batch wizard where you left off.


Step 38: Complete Prerequisites Check

Back in the Microsoft 365 Exchange Admin Center:
  1. Verify that you are on the Prerequisites for Google Workspace migration page.
  2. Since you have manually completed all the listed tasks (Service Account creation, API enablement, and Domain-wide delegation) in the previous steps, you can now proceed.
  3. Click the Next button at the bottom of the screen.


Step 39: Create a New Migration Endpoint

  1. The Set a migration endpoint window will now appear.
  2. Select the option: Create a new migration endpoint.
  3. Click Next.


Step 40: Configure Endpoint General Information

  1. On the General information page, locate the Migration endpoint name field.
  2. Type a unique name for this endpoint (e.g., GWorkspace_Endpoint).
  3. Leave the Max concurrent migrations and Max concurrent incremental syncs fields at their default values (unless you have specific reasons to change them).
  4. Click Next.


Step 41: Configure Google Workspace Connection

1. On the Google Workspace configuration page:
  • Email address: Enter the email address of the Google Workspace Super Admin (this must be the account used to create the Service Account).
  • JSON key: Click the button to upload (often labeled Choose File or Import) and select the .json file you downloaded to your computer earlier.
2. Once the file is uploaded and the email is entered, click Next.


Step 42: Confirm Endpoint Creation

  1. You should now see a confirmation or the list of endpoints showing your new Migration Endpoint has been created successfully.
  2. Click Next to proceed to the user selection stage.


Step 43: Create the User Migration CSV File

You need to create a simple CSV file that tells Microsoft 365 which users to migrate. Since we are using the Service Account (API) method, you do not need user passwords.
  1. Open Excel or a plain text editor (like Notepad).
  2. In the first row (A1), enter the exact header: EmailAddress (Note: Ensure there are no spaces in the header).
  3. In the rows below, list the Microsoft 365 email addresses for every user you want to migrate in this batch.
  4. Save the file as a .csv (Comma Separated Values) file (e.g., migration_users.csv).
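Typing the file by hand is easy to get wrong (an extra space in the header or a blank row is enough to fail validation), so generating it can be safer. A small sketch, assuming the header Exchange Online expects is EmailAddress and using example addresses:

```python
import csv

# Example batch; replace with your Microsoft 365 addresses.
users = [
    "alice@yourcompany.com",
    "bob@yourcompany.com",
]

def write_migration_csv(addresses, path="migration_users.csv"):
    """Write the migration batch CSV: one header row, one address per row."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["EmailAddress"])  # exact header, no spaces
        for address in addresses:
            writer.writerow([address.strip()])  # strip stray whitespace
    return path

write_migration_csv(users)
```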


Step 44: Import the User List (CSV)

  1. On the Add user details page of the wizard, select the option to Manually upload a CSV file.
  2. Click the Choose File (or "Browse") button.
  3. Select the .csv file you just created (e.g., migration_users.csv).
  4. Once the file is uploaded and validated, click Next to proceed.


Step 45: Configure Migration Settings

1. On the Move configuration window:
  • Target delivery domain: Enter your Microsoft 365 routing domain (typically yourcompany.onmicrosoft.com). This ensures email routing works correctly during the migration.
  • Select items to migrate: Check the boxes for the data types you want to move:
    • Mail
    • Calendar
    • Contacts (you can also select Rules if available/needed)
2. Click Next to proceed.


Step 46: Schedule and Start the Migration Batch

1. On the Schedule batch migration page, configure the final settings:
  • Send a report to: By default, your admin email is selected. You can add other recipients who should receive the final status report.
  • Start the migration batch: Select Automatically start the batch (or "Automatically processing the batch").
  • End the migration batch: Select Automatically complete the migration batch.
  • Time zone: Select your local time zone from the dropdown menu to ensure reports and schedules align with your time.
2. Click the Save button to finalize the wizard and begin the migration process.


Step 47: Finalize Batch Creation

  1. Wait for Processing: The system will take a moment to process your request.
  2. Confirmation: You will see a status message indicating Batch creation successful.
  3. Action: Click the Done button to close the wizard.
Note: After clicking Done, you will be returned to the migration dashboard where you can monitor the progress of your new batch.


Step 48: Monitor Migration Progress

  1. After clicking "Done," you will be redirected to the main Migration dashboard.
  2. Locate your batch in the list. You will see the Status column change to Syncing.
  3. Wait for Completion: The migration process will now copy data from Google Workspace to Microsoft 365. This can take significant time depending on the size of the mailboxes. You do not need to keep the window open; the process runs in the background.


Step 49: Verify Migration Completion

  1. Wait for Data Transfer: As mentioned, this process will take time depending on the size of the mailboxes (ranging from minutes for test accounts to hours or days for large organizations).
  2. Refresh Status: Periodically click the Refresh button (circular arrow icon) in the toolbar to update the list.
  3. Confirm Completion: Once the data transfer is finished, the Status column will change from "Syncing" to Synced or Completed (depending on the batch finalization settings you chose).


That completes the full migration from Google Workspace to Microsoft 365. I hope this article has helped you understand how to perform a seamless Google Workspace migration using the automated batch method.

Important Post-Migration Reminder: Now that your data has been migrated, the final step is to update your MX Records in your DNS settings to point to Microsoft 365. This is critical to ensure that all new incoming emails are delivered directly to your new Microsoft 365 mailboxes instead of Google Workspace.

Update MX Records to Route Mail to Microsoft 365

While your old emails have been migrated, new emails may still be going to Google Workspace. To switch the flow of new emails to Microsoft 365, you must update your domain's MX (Mail Exchange) records.

To finalize the mail flow, you need to update your DNS records. Microsoft 365 provides a wizard to help you with this.
  1. Open the Microsoft 365 Admin Center.
  2. In the left-hand navigation pane, go to Settings > Domains.
  3. Click on your specific domain name (e.g., yourcompany.com) to open its details.
  4. Click on the Manage DNS (or Continue setup) button to view the required records.


Choose DNS Connection Method

  • After clicking "Manage DNS," you will see the Connection options (or "How do you want to connect your domain?") page.
  • Select the radio button for Add your own DNS records.
Note: This option allows you to manually copy the required MX, SPF, and CNAME records and paste them into your domain registrar (e.g., GoDaddy, Cloudflare, Namecheap). This is often safer than allowing Microsoft to access your DNS settings automatically.
  • Click Continue.

Update DNS Records

  • The wizard will now display a list of the specific records you need to add to your domain registrar (MX, CNAME, and TXT/SPF).
  • Action: Log in to your domain registrar's website (e.g., GoDaddy, Namecheap) and create the new records exactly as shown on the screen.
    • MX Record: Directs email to Microsoft 365.
    • CNAME (Autodiscover): Helps Outlook and mobile apps find the server automatically.
    • TXT (SPF): Authorizes Microsoft 365 to send email on your behalf.
  • Once you have added all the values in your registrar, return to this window and click Continue.
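The exact values always come from the wizard, but for a domain like yourcompany.com the three records typically follow Microsoft's standard patterns. The sketch below uses zone-file notation for illustration only; the MX host follows the usual yourcompany-com.mail.protection.outlook.com naming convention:

```text
; Illustrative values only — copy the exact records shown in the wizard.
yourcompany.com.               IN  MX     0  yourcompany-com.mail.protection.outlook.com.
autodiscover.yourcompany.com.  IN  CNAME     autodiscover.outlook.com.
yourcompany.com.               IN  TXT       "v=spf1 include:spf.protection.outlook.com -all"
```

Keep in mind that DNS changes can take time to propagate (commonly up to the TTL of the old records), so some mail may continue arriving in Google Workspace briefly after the switch.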

Conclusion

Mission Accomplished!

By following this step-by-step procedure, you have successfully migrated your organization from Google Workspace to Microsoft 365. You’ve handled everything from setting up the Google Service Account to configuring the final DNS records.

Your users can now log in to their new Microsoft 365 accounts with all their historical emails, contacts, and calendars ready to go.