May 15, 2025

How Googlebot Crawls Websites: Understanding Robots.txt, Sitemaps, and SEO Flow

Introduction

Have you ever wondered how your website shows up on Google? It all starts with a process called crawling, where Google’s robot (called Googlebot) visits your website, reads your content, and decides where it should appear in search results.

In this blog, we’ll explore how Googlebot works, what the robots.txt file and sitemap.xml do, and how all of this connects to SEO and getting your site to the first page of Google.


What is Googlebot?

Googlebot is a special program (or bot) created by Google. Its job is to “crawl” the web — visiting websites, reading pages, and collecting data to store in Google’s database.

It works like this:

  • It finds new pages by following links.
  • It checks old pages for updates.
  • It sends this data back to Google, which then decides how to rank your pages in search results.

What is Robots.txt?

The robots.txt file is a simple text file placed in the root folder of your website. It tells search engine bots which pages they are allowed or not allowed to visit.

Example of a robots.txt file:
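A minimal robots.txt along these lines (the /admin/ path is a placeholder):

```
User-agent: *
Disallow: /admin/
Allow: /
```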

  • User-agent: * means all bots (Google, Bing, etc.)
  • Disallow: /admin/ means bots should not visit this folder
  • Allow: / means bots can crawl the entire website

This file is useful if you want to hide private or sensitive pages from search engines.


What is Sitemap.xml?

A sitemap is an XML file that lists all the important pages on your website. It helps search engines find and understand your website structure faster. A sitemap might look like this:
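For example, a minimal sitemap with a single URL entry (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2025-05-15</lastmod>
  </url>
</urlset>
```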

You can submit your sitemap to Google through Google Search Console. This helps make sure all your pages are found, especially new or deep-linked pages that aren’t easily accessible.


How Googlebot Crawls and Indexes Your Site – The Flow

Here’s a simple flow of how it works:

  1. Googlebot checks your robots.txt file.
  2. It follows allowed links and starts crawling your pages.
  3. It reads your sitemap.xml (if provided) to discover more pages.
  4. Content from your pages is stored in Google’s index.
  5. Based on quality, keywords, and links, Google ranks your pages in search results.
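Step 1 of this flow — checking robots.txt — can be sketched as a toy matcher. This is a simplified illustration only: real crawlers implement the full Robots Exclusion Protocol (RFC 9309), including longest-match precedence, whereas this sketch lets the first matching rule win.

```javascript
// Toy robots.txt matcher: collects Disallow/Allow rules for "User-agent: *"
// and answers whether a path may be crawled. Simplified: first matching
// rule wins; real crawlers apply longest-match precedence.
function parseRobots(robotsTxt) {
  const rules = [];
  let applies = false;
  for (const raw of robotsTxt.split("\n")) {
    const line = raw.split("#")[0].trim(); // strip comments and whitespace
    if (!line) continue;
    const [field, ...rest] = line.split(":");
    const value = rest.join(":").trim();
    switch (field.trim().toLowerCase()) {
      case "user-agent":
        applies = value === "*"; // only track the wildcard group here
        break;
      case "disallow":
        if (applies && value) rules.push({ allow: false, prefix: value });
        break;
      case "allow":
        if (applies && value) rules.push({ allow: true, prefix: value });
        break;
    }
  }
  return rules;
}

function isAllowed(rules, path) {
  for (const rule of rules) {
    if (path.startsWith(rule.prefix)) return rule.allow;
  }
  return true; // no matching rule: crawling is allowed by default
}

const rules = parseRobots("User-agent: *\nDisallow: /admin/\n");
console.log(isAllowed(rules, "/admin/settings")); // false
console.log(isAllowed(rules, "/blog/post-1"));    // true
```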

How SEO Helps You Rank on Google

Crawling and indexing are only part of the process. The next step is ranking, which depends on SEO (Search Engine Optimization).

Here are the main parts of SEO:

  1. On-Page SEO
    • Use proper keywords in titles, headings, and content.
    • Add meta descriptions.
    • Use clean URLs, fast-loading pages, and mobile-friendly design.
  2. Off-Page SEO
    • Get backlinks (other websites linking to you).
    • Share content on social media.
    • Build trust and authority.
  3. Technical SEO
    • Use a valid robots.txt file.
    • Keep your sitemap updated.
    • Fix broken links and duplicate content.
    • Use schema markup (structured data).
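Schema markup is typically added as a JSON-LD script block in the page's HTML. Here is a minimal example for a blog post using the schema.org BlogPosting type (all values are placeholders):

```
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  "headline": "How Googlebot Crawls Websites",
  "datePublished": "2025-05-15",
  "author": { "@type": "Organization", "name": "Example Author" }
}
</script>
```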

Common Mistakes to Avoid

  • Accidentally blocking Googlebot in robots.txt.
  • Forgetting to submit a sitemap.
  • Slow page speed or a site that isn’t mobile-friendly.
  • Using duplicate or thin content.
  • Ignoring crawl errors in Google Search Console.

Best Practices for SEO & Googlebot Crawling

  • Create and submit a sitemap regularly.
  • Always check your robots.txt file after changes.
  • Use internal linking wisely.
  • Monitor your site’s health in Google Search Console.
  • Focus on quality content that helps users.

Conclusion

Understanding how Googlebot crawls your website, and how robots.txt and sitemap.xml work, gives you better control over how your content appears on Google. Combine this with solid SEO, and your website will be on the right path to reach the top of search results.

If you have any questions you can reach out to our SharePoint Consulting team here.

PDF Text Not Displaying on macOS (Adobe Acrobat Pro)? Font Compatibility & Fix

Issue Summary:

When generating PDFs and opening them via Adobe Acrobat Pro on macOS, certain text content was not displaying properly. This issue was traced back to unsupported or improperly embedded web fonts, particularly WOFF/WOFF2, which are not supported for embedding in PDFs on macOS.

Key Findings & Root Cause:

  • macOS does not embed WOFF/WOFF2 in PDFs.

  • Adobe Acrobat Pro has inconsistent rendering with web fonts.

  • Some fonts like "Exo 2" from Google Fonts fail to render unless properly converted and embedded.


macOS-Compatible Font Formats:

  Font Format   Extension        Embeds in PDF
  TrueType      .ttf             Yes
  OpenType      .otf             Yes
  AAT / dfont   .dfont           Yes
  WOFF/WOFF2    .woff / .woff2   ⚠️ Web only

System Fonts Pre-Installed in macOS:

  • Sans-serif: Helvetica, Arial, SF Pro

  • Serif: Times New Roman, Georgia, Palatino

  • Monospace: Courier New, Menlo


Solutions & Best Practices:

  1. Use widely supported fonts: Arial, Verdana, Helvetica, SF Pro.

  2. Convert WOFF/WOFF2 fonts to .ttf or .otf formats before PDF generation.

  3. Use @font-face with local font files:

     @font-face {  
      font-family: 'Exo 2';  
      src: url('/fonts/Exo2-Regular.ttf') format('truetype');  
     }  
    
  4. In Puppeteer: ensure all fonts are fully loaded (for example, by waiting for document.fonts.ready) before page.pdf() is called.

  5. For critical compatibility, prefer Arial or Verdana in TrueType format.


Resolution Achieved:

  • Switching to Verdana in TrueType format (.ttf) and ensuring it was properly embedded in the PDF resolved the issue across macOS devices and Adobe Acrobat Pro.

  • Paging layout issues with Verdana were noted but considered minor and fixable.


Recommendation:

Stick to TrueType system fonts for maximum compatibility in PDF generation for macOS users. Avoid relying on WOFF/WOFF2 or Type 1 fonts, and always test in native apps like Adobe Acrobat Pro post-generation.

If you have any questions you can reach out to our SharePoint Consulting team here.

A Comprehensive Guide to Microsoft Planner: Features, Benefits, Challenges, and Tips for Effective Use

Microsoft Planner is one of the task management tools included with Microsoft 365. It enables teams to create, assign, and organize tasks while fostering collaboration and providing clear visibility into project progress. With a visual, user-friendly interface, Planner offers a kanban-style experience similar to Trello or Asana but stands out for its deep integration with other Microsoft 365 apps like Teams, Outlook, and SharePoint. 

What is Microsoft Planner? 

Microsoft Planner is a visual task management application that tracks and arranges work using a kanban-style board. It allows users to: 

  • Create task boards (called plans), 
  • Organize tasks into buckets (categories), 
  • Assign tasks to team members, 
  • Set due dates and task priorities, 
  • Monitor progress through status updates (Not Started, In Progress, Completed). 

While it's not a full-scale project management solution like Microsoft Project, Planner is ideal for lightweight, team-based task coordination. It’s seamlessly integrated with Microsoft 365 Groups, making it an excellent fit for teams already working within Outlook, Teams, or SharePoint. 


Why & When to Use Microsoft Planner 

Why use it: 

  • To visually organize team tasks and projects. 
  • To assign responsibilities and monitor accountability. 
  • To collaborate on shared tasks with comments and file attachments. 
  • To track progress through visual indicators and charts. 
  • To manage small-to-medium projects such as: Marketing campaigns, Employee onboarding, IT system upgrades.

When to use it: 

  • When managing team-based projects where collaboration and transparency are essential. 
  • When you need a simple, visual tool for tracking deadlines and progress. 
  • When using Outlook, Microsoft Teams, or SharePoint together with task tracking. 
  • When full-fledged project management software is unnecessary or overly complex. 

Benefits of Microsoft Planner: 

  • Smooth Integration: Straightforward integration with To Do, SharePoint, Outlook, and Microsoft Teams. 
  • Visual Task Boards: Drag-and-drop interface with a kanban board layout. 
  • Notifications & Reminders: Alerts through email and Teams for due dates and task assignments. 
  • Team Collaboration: Tasks are shareable, comment-enabled, and can be updated by multiple users. 
  • Progress Tracking: Built-in charts give a quick overview of status and bottlenecks. 
  • File Attachments: Attach images, documents, or links directly to tasks. 
  • Ease of Use: Intuitive layout with minimal learning curve. 
  • Flexible Task Assignment: Assign tasks to one or more users, including external collaborators. 
  • Mobile & Web Access: Accessible via browsers and mobile apps, ideal for hybrid teams. 
  • Enterprise Security: Inherits Microsoft 365’s enterprise-level security protocols. 


Pros: 

  • Integrated seamlessly with Microsoft 365 apps. 
  • Simple, drag-and-drop interface. 
  • Minimal training required for onboarding. 
  • Real-time collaboration and file sharing. 
  • Visual progress tracking (charts, boards, calendar view). 
  • Built-in notifications and reminders. 
  • Accessible via Microsoft Teams. 
  • No extra cost for Microsoft 365 subscribers. 
  • Allows multiple assignees per task. 
  • Works on both web and mobile platforms. 

Cons: 

  • Requires a Microsoft 365 business subscription (not available for personal users). 
  • No Gantt charts or task dependencies like advanced PM (Project Management) tools. 
  • No dedicated desktop application. 
  • Limited customization for fields or workflows. 
  • Not suitable for managing large-scale, complex projects. 
  • Some terminology (like “buckets”) may be confusing for new users. 
  • Mobile app is functional but less capable than the web version. 

Tips for Using Microsoft Planner Effectively: 

  • Use Buckets Strategically – Define buckets based on task stages, priorities, or departments for clarity. 
  • Switch Views Often – Use Board, Charts, and Schedule views to get varied insights into progress. 
  • Add Task Details – Use checklists, due dates, attachments, and labels to keep tasks informative and actionable. 
  • Integrate with Teams – Add Planner as a tab in Teams to enable centralized collaboration. 
  • Monitor Charts View – Regularly check for overdue or blocked tasks to redistribute workloads. 
  • Automate with Power Automate – Create flows to notify users or schedule events when tasks are updated. 
  • Assign Due Dates – Ensure tasks trigger reminders and appear in schedules. 
  • Group by Different Views – Use groupings such as “Assigned To” or “Due Date” to analyze task distribution. 
  • Use Copy Plan – Reuse structure and task templates for recurring projects. 
  • Archive Completed Plans – Keep your workspace clean and organized. 
  • Use Templates – Start projects with predefined templates to save time and ensure consistency. 

Final Thoughts: 

Microsoft Planner is an intuitive and collaborative tool best suited for small to medium-sized teams already using Microsoft 365. It excels at visual task tracking, lightweight project planning, and seamless integration across the Microsoft ecosystem. While it lacks the sophistication of full project management tools, it is perfect for teams looking for simplicity, flexibility, and collaboration. 

If you have any questions you can reach out to our SharePoint Consulting team here.

May 8, 2025

UX Design Process Guide: How to Build Applications That Enhance User Experience and Drive Business Success

Introduction:

In today’s competitive digital landscape, developing applications that are not only functional but also user-centered is critical for success. A well-designed user experience (UX) ensures that every interaction feels intuitive, purposeful, and valuable—ultimately driving user satisfaction and business growth.

This blog outlines a 10-step UX design process that bridges the gap between business goals and user needs. Whether you’re a designer, product manager, or stakeholder, following these structured steps will help you build products that solve real problems, stand out in the market, and deliver measurable impact.



Essential Steps of a Successful UX Design Process:

1. Understand the Business & User Goals

Action: Conduct stakeholder interviews, user research, and analyze business objectives.
Tip: Use tools like empathy maps or Lean UX canvas to align user and business needs from the start.


2. Conduct User Research

Action: Perform qualitative (interviews, observations) and quantitative (surveys, analytics) research.
Tip: Focus on identifying pain points, behaviors, and motivations. Keep user personas grounded in real data.


3. Analyze Competitors & Market Trends

Action: Perform a UX competitive analysis and heuristic evaluation of similar products.
Tip: Look for patterns in interaction design, usability, and visual elements. Don’t copy—improve.


4. Define User Personas & Journeys

Action: Build user personas and customer journey maps to visualize the experience.
Tip: Highlight emotions, barriers, and opportunities at each journey stage to guide design decisions.


5. Set Clear UX Goals & KPIs (Key Performance Indicators)

Action: Define measurable success metrics—task success rate, error rate, time-on-task, etc.
Tip: Align KPIs with business goals and user needs to validate the design’s impact later.


6. Create Information Architecture (IA)

Action: Structure content and features logically using sitemaps, user flows, and task flows.
Tip: Use card sorting with real users to inform your IA decisions.


7. Wireframe Key Screens

Action: Sketch or digitally create low-fidelity wireframes to visualize layout and flow.
Tip: Focus on usability and content hierarchy. Use grayscale to avoid early distractions with UI elements.


8. Design High-Fidelity UI

Action: Develop a consistent visual language using UI kits or design systems (like Material or Fluent).
Tip: Prioritize accessibility (contrast, font size, spacing) and visual hierarchy. Use real content when possible.


9. Prototype & Conduct Usability Testing

Action: Create interactive prototypes with tools like Figma or Adobe XD, and test with real users.
Tip: Observe, don’t explain. Let users try tasks independently. Use think-aloud protocol for richer insights.


10. Iterate & Handoff to Development

Action: Refine designs based on feedback and collaborate closely with devs for smooth implementation.
Tip: Use design tokens, spec-ready files, and maintain version control. Keep communication open.

Expert Bonus Tips:

- Consistency is king. Use a design system to stay scalable and efficient.

- Test early, test often. Don’t wait for perfection to validate your ideas.

- Design for edge cases. Consider loading states, errors, and empty screens.

- Document everything. Good documentation reduces friction across teams.

- Stay user-first. Business goals matter, but user experience drives loyalty.


Conclusion:

Great UX doesn’t happen by accident—it’s the result of deliberate research, thoughtful design, continuous testing, and close collaboration across teams. By applying the steps outlined here, we can create applications that not only meet business objectives but also genuinely delight users.

If you have any questions you can reach out to our SharePoint Consulting team here.

Secure Next.js + React Apps: Prevent Injection Attacks with Proper Input Validation Using Yup and DOMPurify

In modern web applications, improper input validation remains one of the most critical and commonly exploited vulnerabilities. It arises when user-provided input is not properly validated, sanitized, or filtered before being processed by the application, and if left unaddressed it opens the door to both client-side and server-side attacks.


Context and Impact

Improper input validation can lead to significant security risks depending on how user data is processed and stored. If user input is accepted without restrictions, an attacker may inject malicious scripts or unexpected payloads that can exploit the system. These attacks can lead to:

  • Injection Attacks: Including SQL injection, command injection, or script injection.

  • Authentication Bypass: Gaining unauthorized access by manipulating form inputs.

  • Buffer Overflows: Causing memory corruption by supplying unusually large inputs.

  • File Manipulation: Overwriting or accessing sensitive files through poorly validated upload/download mechanisms.

  • Denial of Service (DoS): Causing performance degradation or application crashes through malformed or excessive input.


Best Practices and Remediation (Specific to Next.js + React with Yup)

In a Next.js React application that uses Yup for client-side validation, the following best practices are recommended to mitigate improper input validation risks:

  • Client-Side Validation with Yup: Use Yup schemas to strictly validate form input fields. Ensure that inputs conform to the expected types, formats, and character lengths. For example, email fields should be validated using .email(), numbers can be range-checked using .min()/.max(), and string patterns can be enforced via .matches().
  • Input Sanitization with DOMPurify in Yup: For inputs such as comments, names, or descriptions that accept free text, integrate DOMPurify into Yup’s .test() method to sanitize the content. DOMPurify helps remove or neutralize potentially harmful content such as:

    • Inline <script> tags

    • HTML event handlers (e.g., onerror, onclick)

    • Embedded JavaScript URLs (e.g., javascript:alert('XSS'))

    • <iframe>, <object>, and other suspicious HTML elements

    • Malformed or encoded HTML intended for injection

Example Implementation (React Hook Form with Yup + DOMPurify)

Below is a sample implementation that demonstrates how to combine DOMPurify with custom forbidden-pattern checks in a Yup schema.

  1. Validation Schema Setup (Yup + DOMPurify)

  2. Using in a React Component (with React Hook Form + Yup)
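The forbidden-pattern check at the heart of step 1 can be sketched in plain JavaScript. Note this is a simplified, self-contained stand-in: in a real application you would call DOMPurify.sanitize() inside a Yup .test() (both `yup` and `dompurify` are npm packages), rather than relying on hand-rolled regexes like these.

```javascript
// Simplified stand-in for DOMPurify-style checks inside a Yup .test().
// In production, prefer DOMPurify.sanitize(value) over manual regexes.
const FORBIDDEN_PATTERNS = [
  /<script\b/i,                   // inline <script> tags
  /\bon\w+\s*=/i,                 // HTML event handlers (onerror=, onclick=, ...)
  /javascript\s*:/i,              // javascript: URLs
  /<\s*(iframe|object|embed)\b/i  // embedded frames/objects
];

// Returns true when the input contains none of the forbidden markup.
// Deliberately conservative: it may reject some unusual but benign text.
function isSafeText(value) {
  return !FORBIDDEN_PATTERNS.some((re) => re.test(value));
}

console.log(isSafeText("Just a harmless comment."));     // true
console.log(isSafeText("<img src=x onerror=alert(1)>")); // false
console.log(isSafeText("<script>alert('XSS')</script>"));// false
```

In a Yup schema this predicate would sit inside `.test("no-markup", "Unsafe content", isSafeText)` on the free-text fields.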

This ensures only plain text or safe content is submitted, reducing the attack surface for cross-site scripting (XSS) and other injection vectors.
  • Server-Side Validation: Reinforce all client-side checks with server-side validation. Inputs must be re-validated on the backend to ensure no bypass has occurred. Additionally, server-side sanitization should mirror or complement the client-side logic to maintain consistency and security.

By combining schema-based validation using Yup with content sanitization via DOMPurify, the application now effectively filters out potentially malicious input types that could have otherwise led to XSS, layout injection, or DOM manipulation attacks.


Conclusion

Improper input validation is a critical issue that should not be underestimated. In applications built with Next.js and React, adopting a layered approach to input validation—using Yup for schema validation and DOMPurify for sanitization—provides robust defense against injection attacks and other user input-related vulnerabilities. These best practices help secure application workflows and uphold user data integrity.

If you have any questions you can reach out to our SharePoint Consulting team here.

How to Use Postman for API Performance Testing: Best Practices and Tools

Introduction

Performance testing is essential for ensuring that your application can handle varying levels of traffic without slowing down or crashing. For APIs, this means assessing how they perform under different loads, checking response times, and testing scalability. While tools like JMeter or LoadRunner are often used for intensive load testing, Postman offers a versatile and user-friendly environment for performance testing smaller to medium-scale systems.

In this post, we'll explore how to optimize Postman for performance testing, from creating the right test scenarios to analyzing performance metrics. We’ll discuss practical techniques to ensure your APIs perform well, even when they face heavy usage.


Why Choose Postman for Performance Testing?

Postman, primarily known for functional API testing, also provides some unique benefits for performance testing. Here are a few reasons why Postman is a great option for performance testing:

  1. User-Friendly Interface: Postman’s intuitive UI makes it easy to create, manage, and execute API requests without needing advanced knowledge of performance testing tools.
  2. Flexibility and Customization: You can script complex tests using JavaScript (via the Pre-request and Tests tabs), adjust requests with dynamic data, and simulate a wide variety of API interactions.
  3. Collection Runner: Automate and run multiple requests in sequence.
  4. Integration with Continuous Testing: Postman works well in CI/CD pipelines, allowing you to automate performance tests as part of your regular workflow, and integrates with Newman, Postman’s CLI, for running tests in bulk.
  5. Environment and Data Variables: Simulate multiple scenarios with reusable variables.

While Postman isn’t designed to handle massive-scale load tests, it is an excellent choice for testing real-world API behavior under moderate traffic.


How to Optimize Performance Testing with Postman

To get the most out of Postman for performance testing, follow these best practices and strategies:

1. Create and Structure Your API Requests

Start by designing your API requests, focusing on the most critical endpoints that receive high traffic. For instance, you might test the performance of the login or data retrieval endpoints. Here's how to begin:

  • Choose Key API Endpoints: Focus on the endpoints that are most important for the functionality of your application, as these are most likely to be under load in real-world scenarios.
  • Structure Your Requests: Set up requests for each endpoint in Postman, including HTTP methods, headers, and parameters. For example, to test a user profile endpoint, you may create a GET request like:
     GET https://api.example.com/users/67890  
    

    1. Create a Test Collection: Begin by grouping the relevant API requests into a collection. This helps in organizing your tests and running them sequentially or in parallel.

    2. Use Environment Variables: Environment variables like {{baseUrl}}, {{token}}, and {{userId}} make your tests more dynamic and reusable.

       {{baseUrl}} = api.example.com  
       {{userId}} = 67890  
       GET https://{{baseUrl}}/users/{{userId}}  

  • Add Test Scripts for Performance: To track performance, write simple test scripts in Postman that check response time. Postman exposes each response time via pm.response.responseTime. For example, this script asserts that the response time is under 500 milliseconds:

     pm.test("Response time is below 500ms", function () {  
         pm.expect(pm.response.responseTime).to.be.below(500);  
     });  

2. Use the Postman Collection Runner for Performance Testing

The Collection Runner is a key feature that allows you to automate performance testing by running multiple requests at once. To optimize your testing:

  • Use Data Files for Variety: Import CSV or JSON files containing test data like user IDs, query parameters, or payloads. This way, you can simulate different real-world scenarios by feeding various data into your requests. For example, create a CSV file with a userId column:

     userId  
     12345  
     67890  
     11223  

  • Run the Collection Multiple Times: The Collection Runner allows you to run requests with different sets of data, helping simulate multiple API calls in a short time to measure the system’s performance under varying conditions.

3. Monitor Performance with Postman Monitors

Postman Monitors allow you to run collections at scheduled intervals. This is especially helpful for tracking the performance of your APIs over time. Here’s how to optimize your monitoring:

  • Set Up a Monitor: Once you’ve created your collection, use the monitor feature to schedule tests at regular intervals, such as every 5 minutes or once an hour.
  • Configure Alerts: You can set up alerts that notify you if certain thresholds are exceeded. For example, if the response time exceeds a certain limit, Postman can send an email notification.

4. Analyze Your Results and Metrics

Once you’ve executed your tests, you need to analyze the data to spot any performance issues. Postman provides some basic metrics like response times, but there are other ways to dive deeper into the data:

  • Postman Console: Use the Postman Console to review detailed logs of each request and response. This will include information like response time, status code, headers, and payload size.

To open the console:

  1. Go to the "View" menu in Postman and select "Show Postman Console".
  2. Run your collection and observe the console logs for performance insights.
  • Log Performance Data: You can extend Postman’s built-in functionality by using custom JavaScript in your tests to log additional performance data, such as response time:

     pm.test("Log response time", function () {  
         console.log("Response time: " + pm.response.responseTime + "ms");  
     });  

    This will give you a more detailed overview of how each request is performing.


5. Running with Newman for Load Simulation (Optional)

While Postman is great for smaller tests, for larger-scale performance testing you can use Newman, the command-line version of Postman. Newman allows you to execute collections in a more automated and scriptable way, run them with larger data sets, and script repeated or parallel runs.

Install Newman, Postman’s CLI companion:

 npm install -g newman  

For example, you can run your collection with many iterations to simulate sustained load:

 newman run your-collection.json --iteration-count 100  

This command executes the collection 100 times. Note that Newman runs iterations sequentially rather than concurrently; to mimic simultaneous requests, launch several Newman processes in parallel (for example, from a shell script or separate CI jobs).


6. Combine Postman with Other Performance Tools

While Postman is powerful, it is not designed to handle extremely high traffic. If you need to push your testing to the limits, consider integrating Postman with dedicated load-testing tools like JMeter or Gatling.

You can export your Postman collections and run the same tests on more specialized tools to get detailed performance insights under heavy load conditions.


7. Best Practices for Enhancing API Performance Testing

To get the best results from your performance testing efforts, follow these best practices:

  • Focus on Key Metrics: Don’t overload your tests with too many assertions. Focus on important performance metrics like response time, status codes, and payload size.
  • Simulate Real Traffic: Ensure your test data and load patterns reflect real-world usage. Use realistic numbers for requests per second and data input.
  • Automate and Monitor: Set up automated tests and regular monitoring to continually assess the performance of your APIs over time.
  • Refine Based on Results: Based on your test results, tweak your system, optimize APIs, and adjust the test scenarios accordingly.

Final Thoughts

Postman’s flexibility makes it a powerful addition to your performance testing toolbox, especially when used early in the API lifecycle. By scripting tests, running iterations, and integrating with Newman, you can catch performance bottlenecks before they escalate into production issues.

If you have any questions you can reach out to our SharePoint Consulting team here.

May 1, 2025

How to Call Dataverse API from Power Pages Using safeAjax JavaScript

Introduction:

Power Pages offers a flexible way to expose Dataverse data to users, but calling the Dataverse Web API directly from JavaScript inside a Power Pages app requires some careful configuration. In this blog, we’ll walk through the required setup steps in Power Pages and demonstrate how to make secure GET and POST calls using a custom safeAjax wrapper.

Whether you're working with authenticated or anonymous users, this guide will help you get started with Dataverse API calls using jQuery.


Step-by-Step Configuration to Enable Dataverse API in Power Pages

Step 1: Configure Site Settings for the Target Table

Navigate to Power Pages Management > Site Settings and add the following entries for your Dataverse table. These settings enable API access, field-level control, and CORS.

  1. Enable Web API for the Table

    • Name: Webapi/{Table Logical Name}/enabled

    • Value: true

  2. Define Fields to Be Accessed

    • Name: Webapi/{Table Logical Name}/fields

    • Value: * (all fields)

  3. Allow Cross-Origin Requests

    • Name: Webapi/{Table Logical Name}/AllowedOrigins

    • Value: * (or specify your domain)

  4. (Optional) Enable Anonymous Access

    • Name: Webapi/{Table Logical Name}/AllowAnonymousAccess

    • Value: true

    • Only required if your Power Pages site supports anonymous users

Make sure to select the correct Website and set Source as Table in all entries.


Step 2: Set Up Table Permissions

Next, go to the Security > Table Permissions section in Power Pages Management.

  • Create a new permission for your table.

  • Assign appropriate permissions (Read, Create, Update, etc.) based on your use case.

  • Link this permission to the appropriate Web Roles (Authenticated or Anonymous Users).

Without table permissions, the API call will fail even if site settings are correct.



safeAjax JavaScript Wrapper for Dataverse API Calls

Once your configuration is complete, use the following script to wrap your AJAX calls securely. This script ensures tokens are included for request validation and handles common error scenarios gracefully.

$(function () {
    // Web API ajax wrapper
    (function (webapi, $) {
        function safeAjax(ajaxOptions) {
            var deferredAjax = $.Deferred();
            shell.getTokenDeferred().done(function (token) {
                // Add headers for ajax
                if (!ajaxOptions.headers) {
                    $.extend(ajaxOptions, {
                        headers: {
                            "__RequestVerificationToken": token
                        }
                    });
                } else {
                    ajaxOptions.headers["__RequestVerificationToken"] = token;
                }
                $.ajax(ajaxOptions)
                    .done(function (data, textStatus, jqXHR) {
                        validateLoginSession(data, textStatus, jqXHR, deferredAjax.resolve);
                    }).fail(deferredAjax.reject); // ajax fail
            }).fail(function () {
                deferredAjax.rejectWith(this, arguments); // On token failure
            });
            return deferredAjax.promise();
        }
        webapi.safeAjax = safeAjax;
    })(window.webapi = window.webapi || {}, jQuery)
});

function appAjax(ajaxOptions) {
    return webapi.safeAjax(ajaxOptions).done(function (res) {
        if (res.value.length > 0) {
            const result = res.value[0];
            alert("Successfully logged in!");
        } else {
            alert("Incorrect ID or Password.");
        }
    })
    .fail(function (response) {
        if (response.responseJSON) {
            alert("Error: " + response.responseJSON.error.message)
        } else {
            alert("Error: Web API is not available... ")
        }
    });
}

Usage Examples

GET Request

Call this when you want to retrieve data from your Dataverse table.

appAjax({
    type: "GET",
    url: "/_api/crb50_tablename?$select=*&$filter=crb50_customerid eq '001' and crb50_password eq '001'",
    contentType: "application/json"
});


POST Request

Use this to insert a new record into Dataverse.

var recordObj = {
    "crb50_name": "Name1"
};

appAjax({
    type: "POST",
    url: "/_api/crb50_TableName",
    contentType: "application/json",
    data: JSON.stringify(recordObj),
    success: function (res, status, xhr) {
        // The id of the newly created record is returned in the
        // "entityid" response header; store it on the local object.
        recordObj.id = xhr.getResponseHeader("entityid");
        table.addRecord(recordObj); // app-specific helper to refresh the UI
    }
});


Conclusion

Integrating Power Pages with Dataverse Web API opens up a wide range of possibilities, from custom login experiences to fully dynamic data interactions. With a few key configurations in site settings and table permissions, and a secure AJAX wrapper like safeAjax, you can harness the full power of Dataverse from the front end of your Power Pages apps.

If you have any questions, you can reach out to our SharePoint Consulting team here.

Fixing SPFx Build Error: “UglifyJs Unexpected Token” When Running gulp bundle --ship

Introduction:

While packaging your SharePoint Framework (SPFx) solution with the production command:

gulp bundle --ship

you may encounter a frustrating error message like:

Unexpected token: name (corefeature)

This usually indicates that UglifyJS, the default minifier in SPFx, stumbled upon ES6+ syntax (e.g., class, let, const) that it does not understand. 

In this post, I will guide you through a clean and effective workaround using terser-webpack-plugin, a modern minifier that fully supports ES6+. 

Why This Error Occurs:

  • Root Cause: UglifyJS does not support modern JavaScript (ES6+). 
  • Impact: Webpack fails during the minification process, stopping the bundle process for production. 
  • Trigger: Usage of ES6+ syntax like class, const, etc., in your SPFx web part code.  
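As an illustration, even a small helper written with modern syntax is enough to trip the legacy minifier: `class`, `const`, arrow functions, and template literals are all rejected by UglifyJS's ES5-only parser at bundle time. The names below are made up for the example:

```javascript
// Hypothetical web part helper using ES6+ features that legacy UglifyJS
// cannot parse, even though the code itself is perfectly valid JavaScript.
class CoreFeature {
    constructor(name) {
        this.name = name;
    }
    describe() {
        const label = `Feature: ${this.name}`;
        return label;
    }
}

const feature = new CoreFeature("search");
console.log(feature.describe()); // "Feature: search"
```

Code like this runs fine in the browser and in `gulp serve`; it only fails when UglifyJS tries to minify it during the `--ship` build.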

Solution: Swap UglifyJS with Terser:

To resolve this, we will: 

  1. Add Terser and Webpack merge dependencies. 
  2. Update gulpfile.js to override the default SPFx Webpack configuration. 
  3. Clean and rebuild your project. 

Step-by-Step Fix:

Step 1: Install Compatible Dependencies:

Update your package.json to include: 

"terser-webpack-plugin-legacy": "1.2.3",
"webpack-merge": "4.2.1"

Then run the following commands in your terminal: 

npm install terser-webpack-plugin --save-dev
npm install terser-webpack-plugin-legacy --save-dev
npm install webpack-merge@4.2.1 --save-dev

Optional (if Babel is needed for ES6+ transpilation): 

npm install @babel/core @babel/preset-env babel-loader --save-dev
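If you do take the Babel route, a minimal .babelrc along these lines is typical. The targets query here is an assumption for the example; adjust it to the browsers your tenant actually needs to support:

```json
{
  "presets": [
    ["@babel/preset-env", {
      "targets": "> 0.5%, last 2 versions, not dead"
    }]
  ]
}
```

Remember that a .babelrc alone does nothing until babel-loader is wired into the webpack configuration, so only add this if you are also customizing the module rules.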

Step 2: Update gulpfile.js:

Modify your gulpfile.js as shown below: 

'use strict';
 
const gulp = require('gulp');
const build = require('@microsoft/sp-build-web');
const merge = require('webpack-merge');
const TerserPlugin = require('terser-webpack-plugin-legacy');
 
build.addSuppression(`Warning - [sass] The local CSS class 'ms-Grid' is not camelCase
  and will not be type-safe.`);

build.initialize(gulp);
 
build.configureWebpack.setConfig({
 additionalConfiguration: function (config) {
   config.plugins = config.plugins.filter(plugin => !(plugin.options && plugin.options.mangle));
 
   return merge(config, {
     optimization: {
       minimize: true,
       minimizer: [new TerserPlugin()]
     }
   });
 }
});

This code replaces UglifyJS with Terser during the bundle phase. 

Step 3: Clean & Rebuild the Project: 

Run the following commands in sequence: 

gulp clean
gulp build --ship
gulp bundle --ship

Your project should now build successfully, free of any “unexpected token” errors from UglifyJS. 

Optional: Babel Setup (Only If Needed): 

If your project uses newer JavaScript features not supported in your target environments, consider setting up Babel. However, for the UglifyJS error alone, swapping in Terser is typically enough. 

Conclusion: 

  • If you are getting the "UglifyJs Unexpected Token" error while bundling your SPFx project, it is because the default minifier does not support modern JavaScript. By switching to terser-webpack-plugin, you can fix the issue and bundle your project without errors. Just follow the steps to update your packages and gulpfile, and you will be good to go!
  • If you run into any issues while implementing this solution, feel free to drop a comment. I will be happy to help. 

If you have any questions, you can reach out to our SharePoint Consulting team here.