June 29, 2023

Excel File Creation: Copying Header Structure from Existing Files


Introduction:

In this blog, we will learn how to create an Excel file based on the structure of an existing Excel file, without content, using Power Automate Flow. We will create a flow that generates a new Excel file inheriting the header structure from the parent file. Regardless of the number of columns or the specific headers in the parent Excel file, our automated flow will replicate the same column names and structure in the new Excel file.

Problem:

We encountered a scenario where we needed to create a separate Excel file with only the header structure from an existing file containing data. Unfortunately, Power Automate's Excel connector does not provide an out-of-the-box action to duplicate or copy a file structure.


Solution:

To address this challenge, we have developed a Power Automate flow that operates on an Excel file and generates another Excel file with only the header structure. These steps are applicable to any type of Excel file, regardless of the header structure. Hence, we can pass a dynamic header structure to the output file. Let's dive into the steps to achieve this:


Step 1:

First, we add the "For a selected file" trigger to initiate our flow. This trigger allows us to activate the flow when a specific file or item is selected within a SharePoint list or library.                               


Step 2:

In this step, we use the "Get tables" action to fetch the list of tables from the existing parent Excel file. By using an expression to retrieve the file name dynamically, we ensure that the "Get tables" action reads the tables from the correct parent Excel file.


Step 3:

Next, we include the "List rows present in a table" action to fetch all the data from the Excel table. This action retrieves all the rows present within the specified table in the Excel file. We can dynamically set the "Table" value by retrieving the 'id' value from the first table obtained in the response of the "Get tables" action.

 first(outputs('Get_tables')?['body/value'])['id']  
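For reference, here is a trimmed, hypothetical illustration of the "Get tables" response body that this expression indexes into. The id value and table name are placeholders, and the real response contains additional properties:

 {
   "value": [
     {
       "id": "{00000000-0000-0000-0000-000000000000}",
       "name": "Table1"
     }
   ]
 }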


Step 4:

We will add the "Compose" action and include the following expression:

 first(outputs('List_rows_present_in_a_table')?['body/value'])  

This expression retrieves the first row of data from the output of the "List rows present in a table" action. By capturing the data from the first row using this expression within the "Compose" action, we store it for further processing.
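To make the following steps easier to follow, here is a hypothetical example of what this first row could look like, assuming the parent table has columns named Title, Email, and Department (your column names and values will differ):

 {
   "@odata.etag": "\"placeholder-etag\"",
   "ItemInternalId": "placeholder-id",
   "Title": "John",
   "Email": "john@contoso.com",
   "Department": "Sales"
 }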

 

Step 5:

Once we have obtained the output from the "Compose" action, we notice that it includes unwanted properties such as "@odata.etag" and "ItemInternalId" alongside the actual column data. To extract only the column names from the JSON object, we need to remove these unwanted properties.


To achieve this, we use the "Select" action, which allows us to manipulate the JSON object and filter out the unwanted properties so that we can easily extract the desired column names.

The "From" value should be set to the following expression:

 json(replace(string(outputs('Compose_5')), '@odata.', 'odata'))  

Switch the "Map" field to text mode and insert the following expression as its value:

 removeProperty(removeProperty(item(), 'ItemInternalId'), 'odataetag')  

This expression removes the "ItemInternalId" and "odataetag" properties from each item in the JSON array.                                 
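Continuing the hypothetical example from Step 4, the "Select" action (which always outputs an array) would then produce only the column data:

 [
   {
     "Title": "John",
     "Email": "john@contoso.com",
     "Department": "Sales"
   }
 ]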


This is the desired outcome we obtained using the "Select" action:


Step 6:

Clicking on the "Next step" button, we choose the "Create CSV file" action. In the "From" field of the "Create CSV file" action, we add the output of the previous "Select" action. This ensures that we capture the modified JSON object with the desired column names and associated values.                            


Here is the expected output we obtained from the "Create CSV table" action:

Step 7:

In this step, we utilize the "Compose" action to process the CSV table created in the previous step. Add the following expression in the "Inputs" field of the "Compose" action:

 first(split(body('Create_CSV_table'), decodeUriComponent('%0D%0A')))  

This expression splits the body of the "Create CSV table" action on the line break ("%0D%0A", i.e., CRLF) and retrieves the first element. This allows us to isolate the header row of the CSV output, which consists of the column names.
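As a quick illustration with the hypothetical columns used earlier, suppose the "Create CSV table" body were:

 Title,Email,Department
 John,john@contoso.com,Sales

The expression above would then return only the first line, "Title,Email,Department", which is exactly the comma-separated header row we need for the new file.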


Here, we have obtained the desired output, which includes the column names that will be utilized in our "Create table" action.


Step 8:

We add the "Create file" action to generate a blank Excel file. In the "File content" field of the "Create file" action, we enter a space to create a blank file.


Step 9:

To create a table in the blank Excel file and include the column names from the "Compose" action in Step 7, we follow these steps:

In the "Column names" field of the "Create table" action, we enter the output of the "Compose" action from Step 7, which is "Compose 6".

By including the column names from the "Compose" action, we ensure that the table created in the Excel file will have the correct column headers.
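Putting it together, the "Create table" action might be configured roughly as follows. The field labels below are from the Excel Online (Business) connector as we recall them, the table range is an assumption and should cover at least as many columns as your header row, and the Compose name should match whatever your flow calls the Step 7 Compose:

 Location:         <your SharePoint site>
 Document Library: Documents
 File:             <Id of the file created by the "Create file" action>
 Table range:      A1:D1
 Column names:     outputs('Compose_6')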


Finally, we save and run the flow manually. You can see the flow generating a new Excel file with the exact same table structure as the parent file.


Conclusion:

This blog presented a straightforward method to dynamically create a new Excel file with the same header structure as the parent table, without any data. By leveraging Power Automate Flow and SharePoint, we achieved this outcome seamlessly.

If you have any questions you can reach out to our SharePoint Consulting team here.

June 22, 2023

Step-by-Step Guide: Unlocking the Power of AI with Azure OpenAI for Custom Data Models

Introduction:

OpenAI has revolutionized the world, prompting many individuals to seek its integration into their own data models. However, concerns regarding data privacy, security, and governance have arisen when utilizing OpenAI models. These considerations hold paramount importance in ensuring the responsible use of AI technologies.

Thankfully, there is a solution: Azure OpenAI Cognitive Services. Microsoft has introduced Azure OpenAI, which enables the deployment of OpenAI models for customized applications while benefiting from the robust security and governance measures provided by Microsoft.

In this article, we will explore the utilization of Azure OpenAI Services to access the ChatGPT model (gpt-3.5-turbo) and leverage Azure Cognitive Search for efficient data indexing and retrieval. Through these technologies, we will demonstrate how to create ChatGPT-like experiences using custom data, all while upholding the necessary privacy, security, and governance protocols.


Prerequisites:

  1. Access to OpenAI Service on Azure
     - Please note that access to Azure OpenAI is currently limited. If your tenant doesn't have access, you can apply for it here.
  2. Your Azure account must have the necessary permissions, such as the Microsoft.Authorization/roleAssignments/write operation (e.g., the User Access Administrator or Owner role). 
  3. Azure Developer CLI
  4. Python 3+
     - Make sure you can run python --version from the console.

  5. Node.js
  6. Git
  7. PowerShell 7+ (pwsh)
     - Verify that you can run "pwsh.exe" from a PowerShell command. If this fails, you likely need to upgrade your PowerShell version. (A quick check of all these prerequisites is sketched just after this list.)
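A minimal way to verify these prerequisites from a terminal is to check that each tool responds with a version number (output will vary by machine):

 azd version
 python --version
 node --version
 git --version
 pwsh -Version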

Installation:

To install and set up the necessary components, follow these steps:
  1. Create a new folder and open it in the Command Prompt.
  2. Run the command "azd auth login". This will open a browser window for authentication. Enter your Azure credentials in the browser window.
     - Note: Once the authentication is complete, you can close the browser window.
  3. You should see a message like "Logged in to Azure" in the terminal window, indicating a successful authentication:

  4. Run the command "azd init -t azure-search-openai-demo". This command will initialize a git repository.
  5. When prompted for a new environment name, you can either keep the default name or change it as needed. Press Enter to keep the default name.
  6. After this step, you will see a message like "SUCCESS: New project initialized!":


You have now completed the installation and initialization process.

Custom Data Files:

To incorporate your own custom data files (such as PDFs, Word Documents, etc.) into your data models, follow these steps:
  1. Navigate to the folder where the project is initialized. Inside this folder, you will find a directory called "data".

  2. Access the "data" folder and you will find sample data files already present. You can replace or copy your own files into this folder.

Ensure that your custom files are placed within this "data" folder for the project to access and utilize them effectively.
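For example, after replacing the samples with your own documents, the project folder might look something like this (folder and file names are placeholders):

 <your-project-folder>/
 └── data/
     ├── employee_handbook.pdf
     ├── benefits_overview.pdf
     └── security_policy.docx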

Azure Deployment:

Now that everything is set up locally, we can proceed with deploying the project on Azure. Follow the steps below:

  1. Open the terminal.
  2. Run the command "azd up".
  3. Select the appropriate subscription and press Enter:

  4. Choose the desired region for deployment. Note that the regions currently supporting the models used in this sample are East US or South Central US. For an up-to-date list of regions and models, you can refer to the documentation here.

  5. The deployment process will begin, which includes packaging up the services and deploying them to Azure. Please be patient as this process may take some time.
  6. Once the deployment is complete, you will see a message in the terminal similar to:

Additionally, you will be provided with an endpoint. Opening this endpoint in a browser will take you to the ChatGPT experience, where you can ask questions to the AI, and it will respond using the custom data files as the source of information and models.

Congratulations! Your application is now deployed on Azure and ready to be accessed via the provided endpoint.

Conclusion:

Azure OpenAI Cognitive Services provides a secure and efficient solution for integrating OpenAI models into custom data models. With the power of Azure Cognitive Search and Microsoft's robust security measures, organizations can deploy ChatGPT-like experiences over custom data while ensuring data privacy and governance. By leveraging Azure OpenAI, businesses can unlock the potential of AI technologies and create innovative applications with confidence.

If you have any questions you can reach out to our SharePoint Consulting team here.

Resolving Attachment Issues in Power Automate Approval Action - 'Start and Wait for an Approval': Step-by-Step Guide


Introduction: 

Recently, while working on a Power Automate flow, we encountered a specific requirement involving the need to send attachments in an Approval action. To address this, we utilized the "Start and wait for an approval" action. Within this action, we clicked on "Show advanced options" and configured the Attachments field using the dynamic content of the array variable we had created for appending the attachment contents.




Here, we passed the ApprovalAttachments array, consisting of JSON objects with two properties: "Name" and "ContentBytes". 

 {   
  "Name": @{items('Apply_to_each_2')?['DisplayName']},   
  "ContentBytes": @{body('Get_attachment_content')}   
 }   




The flow executed without any errors, and we received the expected email. However, the attached files were missing from the email.  


It became evident that the attachments were not being successfully included in the email. Therefore, in this blog, we will thoroughly examine the procedure for sending multiple attachments using the "Start and wait for an approval" action within Power Automate. Our focus will be on resolving the specific issue that arises when attempting to send attachments using this particular action. Let's dive into the step-by-step process and find a solution to this problem! 


Solution: 

Step 1: Initialize the array variable. 
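For reference, the "Initialize variable" action is configured roughly as follows; the variable name just needs to match the one you append to and use later:

 Name:  ApprovalAttachments
 Type:  Array
 Value: (leave empty)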




Step 2: "Get attachments" action in Power Automate returns all attachments for given item in the array.



 

Step 3: To read all the attachment contents, click on "New step" and then select the "Get attachment content" action. 




Step 4: Proceed to the next step and select the "Append to array variable" action inside the "Apply to each" loop. 

- Now, select the ApprovalAttachments variable that you previously created.

- Configure the value using the following JSON structure: 

  • Name: Set the display name from the dynamic content. 

  • Content: Set the attachment content from the "Get attachment content" dynamic content. 


 {
  "Name": @{items('Apply_to_each_2')?['DisplayName']},
  "Content": @{body('Get_attachment_content')}
 }

It is important to note that you should use "Content" instead of "ContentBytes" as the property name to avoid any issues when sending attachments. Now you will receive the email with the attachments included. 



Step 5: In the final "Start and wait for an approval" action, go to the advanced options and set the Attachments field using the dynamic content of the ApprovalAttachments array. 
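If you prefer expressions over picking the dynamic content token, the Attachments field can simply reference the array variable:

 variables('ApprovalAttachments')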


 

Conclusion: 

So, this is how we can send attachments through the approval action without encountering any issues. 


If you have any questions you can reach out to our SharePoint Consulting team here.