June 16, 2017

How to escape a special character in filter parameter using REST API

Scenario: 
Recently, while working with the REST API to get items from a SharePoint list filtered by the Title field, I got stuck. It worked fine for data without special characters, but a value containing a special character - a single quote/apostrophe (') - in the filter parameter caused an error. I tried passing the value through encodeURIComponent, but that didn't work either and I still got an error.


Reason: 
The encodeURIComponent, escape, and encodeURI functions do not escape these special characters: - _ . ! ~ * ' ( )
 
Solution:
To pass such a special character in a filter parameter, we should double the character (use two single quotes).

Example:
Non-working REST API:
https://{Site URL}/_api/web/lists/GetByTitle('listname')/items?$select=ID&$filter=Title eq 'what's up'

Working REST API:
https://{Site URL}/_api/web/lists/GetByTitle('listname')/items?$select=ID&$filter=Title eq 'what''s up'
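In JavaScript, this doubling can be applied automatically before the value goes into the URL. Here is a minimal sketch (the helper name escapeODataValue is hypothetical):

```javascript
// Hypothetical helper: double single quotes for OData, then URL-encode.
// OData treats two consecutive single quotes inside a string literal
// as one literal quote character.
function escapeODataValue(value) {
    return encodeURIComponent(value.replace(/'/g, "''"));
}

// Build the $filter query string for a title containing an apostrophe.
var filter = "$filter=Title eq '" + escapeODataValue("what's up") + "'";
```

Note that encodeURIComponent leaves the single quotes themselves untouched (it cannot escape them, as mentioned above), which is exactly why the doubling step is needed first.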

If you have any questions you can reach out to our SharePoint Consulting team here.

Create a subsite programmatically using Custom Site Template through SharePoint JavaScript Object Model and REST API

Here, I'll explain in detail how to create a SharePoint sub-site programmatically using a custom Site Template through JSOM and the REST API.
 
Implementation Approach: First, we need to identify the Web Template Id for our custom template programmatically. Then, we will create a sub-site using that Web Template Id.

Now, let's go through the code. We have two functions:

1. CreateSubsiteByTemplateName(title, description, webUrl, templateTitle) 
- This function creates a sub-site by Template Name. First, it finds the Template Id from the Template Name, and then calls another function to create the sub-site.
Parameters Information:
    1. title= name of sub-site which you want to create Ex: "subsite1"
    2. description = description for sub-site
    3. weburl = URL for sub-site Ex: "subsite1"
    4. templateTitle= Name of custom template Ex: "Physicians"

2. CreateSubsiteByTemplateId(title, description, webUrl, templateId)
- This function will create a sub-site by Template Id.
Parameters Information:
    1. title= name of sub-site which you want to create Ex: "subsite1"
    2. description = description for sub-site.
    3. weburl = URL for sub-site Ex: "subsite1"
    4. templateId= Id of custom template Ex: "{D5729655-B3D8-4DED-B5E9-3EE09934FC80}#Physicians"

Code Snippet 1:
function CreateSubsiteByTemplateName(title, description, webUrl, templateTitle) {
    var context = new SP.ClientContext.get_current();
    var web = context.get_web();
    context.load(web);
    // Get all web templates available for locale 1033 (English).
    var webTemplates = web.getAvailableWebTemplates(1033, false);
    context.load(webTemplates);
    context.executeQueryAsync(function () {
        var enumerator = webTemplates.getEnumerator();
        var templateId = "STS#0"; // Fall back to the default Team Site template.
        while (enumerator.moveNext()) {
            var webTemplate = enumerator.get_current();
            var webTitle = webTemplate.get_title();
            if (webTitle == templateTitle) {
                templateId = webTemplate.get_name();
                break;
            }
        }
        CreateSubsiteByTemplateId(title, description, webUrl, templateId);
    },
    function (sender, args) {
        alert(args.get_message());
    });
}

Code Snippet 2:
function CreateSubsiteByTemplateId(title, description, webUrl, templateId) {
    var restAPIURL = "/_api/web/webinfos/add";
    var newSiteData = JSON.stringify({
        'parameters': {
            '__metadata': {
                'type': 'SP.WebInfoCreationInformation'
            },
            'Url': webUrl,
            'Description': description,
            'Title': title,
            'Language': 1033,
            'WebTemplate': templateId,
            'UseUniquePermissions': true
        }
    });
    $.ajax({
        url: restAPIURL,
        type: "POST",
        async: false,
        data: newSiteData,
        headers: {
            "accept": "application/json;odata=verbose",
            "content-type": "application/json;odata=verbose",
            "X-RequestDigest": $('#__REQUESTDIGEST').val()
        },
        success: function (data) {
            console.log('site created');
        },
        error: function (data) {
            console.log('Error creating site');
        }
    });
}

If you have any questions you can reach out to our SharePoint Consulting team here.

April 18, 2017

Page-break-before doesn't work with IE.

I came across a requirement to print a web page using JavaScript. Normally, if we need to add a page break while printing a web page, we use the "page-break-before" property in CSS. But "page-break-before" does not work in the Internet Explorer (IE) browser. Here, I'll show you how we can achieve the desired behavior.

Resolution:

Below is the HTML snippet, which is generated using JavaScript. Here, we want to add a page break before each Div element, so I've used the "page-break-before:always" style property on each Div element. It breaks the page when printing and works fine in Chrome and Firefox, but it doesn't work in Internet Explorer (IE).

<h1>Page Title</h1>
<!-- content block -->
<!-- content block -->
<div style="page-break-before:always;"></div>
<!-- content block -->
<!-- content block -->
<div style="page-break-before:always;"></div>
<!-- content block -->
<!-- content block -->
<!-- content -->
To have the "page-break-before:always" property work in all browsers including Internet Explorer, we need to add one extra Div tag containing a non-breaking space ("&nbsp;") as shown in the code snippet below.

<h1>Page Title</h1>
<!-- content block -->
<!-- content block -->
<div style="page-break-before:always;"></div>
<div>&nbsp;</div>
<!-- content block -->
<!-- content block -->
<div style="page-break-before:always;"></div>
<div>&nbsp;</div>
<!-- content block -->
<!-- content block -->
<!-- content -->
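Since the markup in this example is generated from JavaScript, the workaround can be wrapped in a small helper. The sketch below (the function name joinWithPageBreaks is hypothetical) joins content blocks with the page-break div plus the empty filler div that IE needs:

```javascript
// Hypothetical helper: join HTML blocks with a print page break between each.
// The extra <div>&nbsp;</div> makes page-break-before work in IE as well.
function joinWithPageBreaks(blocks) {
    var pageBreak =
        '<div style="page-break-before:always;"></div>' +
        '<div>&nbsp;</div>';
    return blocks.join(pageBreak);
}

// Each array entry prints on its own page.
var printHtml = joinWithPageBreaks([
    '<h1>Page 1</h1>',
    '<h1>Page 2</h1>'
]);
```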
If you have any questions you can reach out to our SharePoint Consulting team here.

April 13, 2017

Power BI – Develop BI Reports from Wikipedia using Power BI

Power BI is a great business analytics service provided by Microsoft. It provides a wide range of interactive visualizations with self-service business intelligence capabilities for developing analytical reports and dashboards. For analytical reports and dashboards, the source of data is very important, and Power BI is flexible enough to consume data from a variety of data sources.

We can also develop Power BI reports and dashboards using Wikipedia as the source of data. Let's create a report based on notable firms in India.


We will develop this report in Power BI Desktop. First, we need to connect to the data source in Power BI Desktop.

Connect to Data Source (Notable firms in India):

1. Open Power BI Desktop, and click “Get Data”:

2. Select “Web” as data source and click “Connect”:

3. Specify Wikipedia Page URL and click “OK”:

4. Select appropriate Page URL and click “Connect”:

5. As we want to develop report for Notable Firms in India, load the data for “Notable Firms”:

So, now data is loaded to Power BI Desktop.


Develop the Report (Notable firms in India):

We will develop a report that helps us visualize the companies by industry and headquarters location.

1. Select a Pie Chart from Visualizations and configure it as shown in the image below, so that we get the company count by industry:

2. Select the Slicer tool and configure it as shown in the image below, so that we can filter the data by company headquarters location:

3. Select a Matrix and configure it as shown in the image below to display the data:

4. Select a Stacked Column Chart and configure it as shown in the screenshot below, so that we can see a visual of companies by headquarters location:

5. Now we have configured the visuals, and the report appears as shown in the image below:

We can use the Pie Chart, Slicer, and Stacked Column Chart to filter the data by Industry and Company Headquarters Location.

Publish the Report (Notable firms in India):

We can publish the developed report to the web. To publish the report to the web, we must have a Power BI account. If you don't have an account, you can sign up here.

1. Click “Publish” in the ribbon bar and sign in to your account in Power BI Desktop:

2. Enter your credentials and click “Sign In”:

3. Select “My workspace” as destination to publish the report.

4. The report is now published to the Power BI web. Click the provided link to open the report on the web.

5. The report now opens on the web:


For anonymous access to this report, I have created an embed code, so if you wish, you can access this report here.

If you have any questions you can reach out to our SharePoint Consulting team here.

March 9, 2017

Add/Move SharePoint 2013 farm Search Topology in Single or Multiple Search Servers using PowerShell

A SharePoint 2013 farm provides a Search Topology that comprises Admin, Crawler, Content Processing, Analytics Processing, Query Processing, and Index Partition components.

  • The Search Administration Component administers new instances of Search components and Search processes.
  • The Search Crawler crawls all the content on our site, allowing us to retrieve results via SharePoint Search.
  • Search Content Processing processes different types of content and indexes it in a single index.
  • The Search Analytics Processing component carries out search analytics and usage analytics.
  • The Search Query Processing component handles incoming queries and returns suitable results.

Scenario 1: If you want to move the SharePoint 2013 farm Search Topology from one server (e.g. Server1) to another (e.g. Server2), follow the steps below:

Prerequisites:
 - Search service application must be created and configured in the farm.
 - Search Topology must have been configured in Server1.
 - Server2 must be part of  SharePoint 2013 farm.
 - Ensure that no search crawl is running and search index is empty.

Connect to any of your SharePoint 2013 farm server and run SharePoint Management Shell as an Administrator. 

Run the below commands in SharePoint Management Shell one by one:

$hostA = Get-SPEnterpriseSearchServiceInstance -Identity "Server2"
# Note: "Server2" is new server name, where we want to move Search Topology.

Start-SPEnterpriseSearchServiceInstance -Identity $hostA
Get-SPEnterpriseSearchServiceInstance -Identity $hostA
$ssa = Get-SPEnterpriseSearchServiceApplication
$newTopology = New-SPEnterpriseSearchTopology -SearchApplication $ssa
New-SPEnterpriseSearchAdminComponent -SearchTopology $newTopology -SearchServiceInstance $hostA
New-SPEnterpriseSearchCrawlComponent -SearchTopology $newTopology -SearchServiceInstance $hostA
New-SPEnterpriseSearchContentProcessingComponent -SearchTopology $newTopology -SearchServiceInstance $hostA
New-SPEnterpriseSearchAnalyticsProcessingComponent -SearchTopology $newTopology -SearchServiceInstance $hostA
New-SPEnterpriseSearchQueryProcessingComponent -SearchTopology $newTopology -SearchServiceInstance $hostA
New-SPEnterpriseSearchIndexComponent -SearchTopology $newTopology -SearchServiceInstance $hostA -IndexPartition 0
Set-SPEnterpriseSearchTopology -Identity $newTopology
Get-SPEnterpriseSearchTopology -SearchApplication $ssa
Get-SPEnterpriseSearchStatus -SearchApplication $ssa -Text

Finally, the Search Topology will be migrated from "Server1" to "Server2", and you will be able to see the Search Topology components on "Server2".

*After moving Search Topology to new server, start full crawl to get search results.

Scenario 2: If you want to add an additional search server (e.g. Server2) along with the existing search server (Server1), follow the steps below:

Prerequisites:
 - Search service application must be created and configured in the farm.
 - Search topology must have been configured in Server1.
 - Server2 must be part of  SharePoint 2013 farm.
 - Ensure that no search crawl is running and search index is empty.

Connect to any server of SharePoint 2013 farm and run SharePoint Management Shell as an Administrator. 
Run the below commands in the Management Shell one by one:

$hostA = Get-SPEnterpriseSearchServiceInstance -Identity "Server2"
# Note: "Server2" is new server name, which we are going to add.

Start-SPEnterpriseSearchServiceInstance -Identity $hostA
Get-SPEnterpriseSearchServiceInstance -Identity $hostA
$ssa = Get-SPEnterpriseSearchServiceApplication
$newTopology = New-SPEnterpriseSearchTopology -SearchApplication $ssa
New-SPEnterpriseSearchAdminComponent -SearchTopology $newTopology -SearchServiceInstance $hostA
New-SPEnterpriseSearchCrawlComponent -SearchTopology $newTopology -SearchServiceInstance $hostA
New-SPEnterpriseSearchContentProcessingComponent -SearchTopology $newTopology -SearchServiceInstance $hostA
New-SPEnterpriseSearchAnalyticsProcessingComponent -SearchTopology $newTopology -SearchServiceInstance $hostA
New-SPEnterpriseSearchQueryProcessingComponent -SearchTopology $newTopology -SearchServiceInstance $hostA
New-SPEnterpriseSearchIndexComponent -SearchTopology $newTopology -SearchServiceInstance $hostA -IndexPartition 1
# Note: When adding a second search server to the topology, we must set -IndexPartition to 1, because "Server1" already uses index partition 0.

Set-SPEnterpriseSearchTopology -Identity $newTopology
Get-SPEnterpriseSearchTopology -SearchApplication $ssa
Get-SPEnterpriseSearchStatus -SearchApplication $ssa -Text

Finally, you will be able to see the newly added search server in the SharePoint 2013 farm.

*After adding an additional search server in SharePoint 2013, start full crawl to get search results.

Scenario 3: If you want to add multiple search servers (e.g. Server1 and Server2) at the same time, follow the steps below:

Prerequisites:
 - Search service application must be created and configured in the farm.
 - Server1 and Server2 must be part of  SharePoint 2013 farm.
 - Ensure that no search crawl is running and search index is empty.

Connect to any server of SharePoint 2013 farm and run SharePoint Management Shell as an Administrator. 

Run the below commands in the Management Shell one by one:

$hostA = Get-SPEnterpriseSearchServiceInstance -Identity "Server1"
$hostB = Get-SPEnterpriseSearchServiceInstance -Identity "Server2"
# Note: "Server1" and "Server2" are new servers, which we are going to add.

Start-SPEnterpriseSearchServiceInstance -Identity $hostA
Start-SPEnterpriseSearchServiceInstance -Identity $hostB
Get-SPEnterpriseSearchServiceInstance -Identity $hostA
Get-SPEnterpriseSearchServiceInstance -Identity $hostB
$ssa = Get-SPEnterpriseSearchServiceApplication
$newTopology = New-SPEnterpriseSearchTopology -SearchApplication $ssa
New-SPEnterpriseSearchAdminComponent -SearchTopology $newTopology -SearchServiceInstance $hostA
New-SPEnterpriseSearchCrawlComponent -SearchTopology $newTopology -SearchServiceInstance $hostA
New-SPEnterpriseSearchContentProcessingComponent -SearchTopology $newTopology -SearchServiceInstance $hostA
New-SPEnterpriseSearchAnalyticsProcessingComponent -SearchTopology $newTopology -SearchServiceInstance $hostA
New-SPEnterpriseSearchQueryProcessingComponent -SearchTopology $newTopology -SearchServiceInstance $hostA
New-SPEnterpriseSearchIndexComponent -SearchTopology $newTopology -SearchServiceInstance $hostA -IndexPartition 0

New-SPEnterpriseSearchAdminComponent -SearchTopology $newTopology -SearchServiceInstance $hostB
New-SPEnterpriseSearchCrawlComponent -SearchTopology $newTopology -SearchServiceInstance $hostB
New-SPEnterpriseSearchContentProcessingComponent -SearchTopology $newTopology -SearchServiceInstance $hostB
New-SPEnterpriseSearchAnalyticsProcessingComponent -SearchTopology $newTopology -SearchServiceInstance $hostB
New-SPEnterpriseSearchQueryProcessingComponent -SearchTopology $newTopology -SearchServiceInstance $hostB
New-SPEnterpriseSearchIndexComponent -SearchTopology $newTopology -SearchServiceInstance $hostB -IndexPartition 1
# Note: When adding an additional search server to the farm, we must set -IndexPartition to 1 for the second server.

Set-SPEnterpriseSearchTopology -Identity $newTopology
Get-SPEnterpriseSearchTopology -SearchApplication $ssa

Get-SPEnterpriseSearchStatus -SearchApplication $ssa -Text

Finally, you will be able to see the newly added search servers in the SharePoint 2013 farm.

*After adding an additional search server in SharePoint 2013, start full crawl to get search results.

If you have any questions you can reach out to our SharePoint Consulting team here.

Users with Full Control permission are not able to create sub-site in SharePoint

Problem Statement:
I spent a lot of time fixing a strange issue while working on a SharePoint 2013 project. The issue was that a user with Full Control permission on the SharePoint top-level site was not able to create a sub-site. Each time the user tried to create either a "Project Site" or a "Team Site", they got a “Sorry, you don’t have access to this page” or “Access Denied” error. The behavior was identical for all users with Full Control permission.

Problem Symptoms:
      1. It appeared SharePoint was ignoring the Full Control access permission.
      2. User with site collection administrator permissions could create sub-sites.
      3. New site collections on the same web application operated normally without this issue.

Root Cause:
     1. Internally, SharePoint manages a hidden list named "TaxonomyHiddenList". The URL of the list is "[sitecollectionURL]/lists/taxonomyhiddenlist".
     2. Generally, all authenticated users (Everyone) have Read access to this list.
     3. When I checked the permissions of the list as a site collection administrator, I found that no user or group had permissions on this list, as you can see in the screenshot below.


Resolution: The solution was simple once identified. Assign Read access on this list to all authenticated users (Everyone) in the site collection.

I spent a lot of time on this issue as it was a difficult one to identify. Hope this helps you overcome it more efficiently.

If you have any questions you can reach out to our SharePoint Consulting team here.

March 7, 2017

Use Session Storage Object over Local Storage in JavaScript

In a few instances, we might come across a requirement where we have to retain/store variable values specific to a browser tab across successive post backs. Normally, if we open the same HTML page in multiple tabs of the same browser, the variable value retained across post backs is the same in all opened tabs. But what if the requirement is to have a different variable value for each opened tab? Let's go through it:

Resolution: As we know, there are two options available in JavaScript to retain values across successive post backs.

1. Local Storage
2. Session Storage

We can use Local Storage to save data in the browser across successive post backs, but there is a limitation: the saved data is shared by all tabs in the browser. In a scenario where we need separate values for each browser tab, we have to use the Session Storage object in JavaScript.

Local Storage:

It can store data locally within the user's browser.

Storage limit is far larger (at least 5 MB) and information is never transferred to the server.

Local storage is per origin (per domain and protocol). All pages, from one origin, can store and access the same data.

Syntax & Examples for Local Storage:

How to store the value to Local Storage in JavaScript?

Syntax: localStorage.setItem("VariableName", "Value");
Example: localStorage.setItem("BR", "Binary Republik");

How to retrieve value from Local Storage variable in JavaScript?

Syntax: localStorage.getItem("VariableName") // Returns Object.
Example: document.getElementById("BRTeam").innerHTML = localStorage.getItem("BR");

How to remove Local Storage variable?

Syntax: localStorage.removeItem("VariableName");
Example: localStorage.removeItem("BR");

Session Storage

The Session Storage object is identical to the Local Storage object, except that it stores data for only one session. So, the value retained with Session Storage is specific to a browser tab.

The data will be deleted when the user closes the specific browser tab.

Syntax & Examples for Session Storage:

How to store the value to Session Storage in JavaScript?

Syntax: sessionStorage.setItem("VariableName", "Value");
Example: sessionStorage.setItem("BR", "Binary Republik");

How to retrieve value from Session Storage variable in JavaScript?

Syntax: sessionStorage.getItem("VariableName") //Returns Object.
Example: document.getElementById("BRTeam").innerHTML = sessionStorage.getItem("BR");

How to remove Session Storage variable?

Syntax: sessionStorage.removeItem("VariableName");
Example: sessionStorage.removeItem("BR");
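To see the difference in practice, here is a minimal sketch of a per-tab page-view counter. The function name incrementPageViews is hypothetical; it takes the storage object as a parameter, so passing sessionStorage gives each tab its own count, while localStorage would share one count across all tabs:

```javascript
// Hypothetical helper: increment a counter in the given Web Storage object.
// Pass sessionStorage for a per-tab count, localStorage for a shared count.
function incrementPageViews(storage) {
    // Web Storage only holds strings, so parse before incrementing.
    var views = parseInt(storage.getItem("pageViews"), 10) || 0;
    views = views + 1;
    storage.setItem("pageViews", String(views));
    return views;
}

// In the browser: incrementPageViews(sessionStorage);
```

Opening the page in a second tab and calling this with sessionStorage starts that tab's count at 1, while localStorage would continue counting from the other tab's value.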

Conclusion: To store variable values specific to a browser tab, use Session Storage rather than Local Storage.

If you have any questions you can reach out to our SharePoint Consulting team here.

February 16, 2017

Nintex Workflow: Access denied. You do not have permission to perform this action or access this resource.

While working with a Nintex workflow, I was trying to send an email through the "Send Email" action to users within a SharePoint Group. If the user who kicked off the workflow is part of that SharePoint Group, the workflow/email works fine. But if the user is not a member of that group, the error below appears (even though the user has Full Control permission on the site):

"Access denied. You do not have permission to perform this action or access this resource."

Cause:
Basically, when the "Send Email" action executes, Nintex tries to get the information of the users to whom the email needs to be sent. If the logged-in user is a member of the SharePoint group, Nintex is able to retrieve the information. But if the user is not part of the group, it cannot access the member information of the group.

Resolution:
To overcome this, allow everyone to see the members of the SharePoint group. To make this change, follow the steps below:


Note: This will enable other users who are not part of the group to view all members of the group.

  • Go to Site Settings --> People & Groups, and browse to that SharePoint Group.

  • From settings menu, select Group Settings.
  • In Group Settings, set the view membership permission to "Everyone".
  • Click OK.

If you have any questions you can reach out to our SharePoint Consulting team here.

January 10, 2017

How to improve SEO ranking of the website.

An important part of promoting your website online is to have it listed by search engines in their search results. The higher positions in the search engine results you get, the more clicks and traffic you will have.
 
The key to having good rankings in free searches is the so-called “Search Engine Optimization” (for short SEO) which starts by having your site indexed by search engines, goes through optimizing the content for search engines and then building valuable links to it. Google has the most dynamically changing algorithm to consider for search engine optimization.
 
Below are common issues which need to be fixed to increase SEO ranking:

Meta Title:
  • Your page’s meta title is an HTML tag that defines the title of your page. This tag displays your page title in search engine results, at the top of a user’s browser, and also when your page is bookmarked in a list of favorites. A concise, descriptive title tag that accurately reflects your page’s topic is important for ranking well in search engines.
  • The meta title of your page has a length of 71 characters. Most search engines will truncate meta titles to 70 characters.

Meta Description:
  • Your page’s meta description is an HTML tag that is intended to provide a short and accurate summary of your page. Search engines use meta descriptions to help identify a page’s topic - they may also display them directly in search engine results. Accurate and inviting meta descriptions can help boost both your search engine rankings and a user’s likelihood of clicking through to your page.
  • The meta description of your page has a length of 252 characters. Most search engines will truncate meta descriptions to 160 characters.
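These length limits are easy to verify with a small script. Below is a minimal sketch (the function name isTruncated and the 70/160 limits taken from the text above are the assumptions) that flags meta text a search engine would truncate:

```javascript
// Hypothetical checker: flag meta text longer than the given limit.
// Common limits from the text above: ~70 characters for titles,
// ~160 characters for descriptions.
function isTruncated(metaText, limit) {
    return metaText.length > limit;
}

// In the browser, the current page's values could be read like this:
// var title = document.title;
// var desc = document.querySelector('meta[name="description"]').content;
// isTruncated(title, 70); isTruncated(desc, 160);
```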

<h1> Headings Status:
  • Check if any H1 headings are used in your page. H1 headings are HTML tags that help clarify the overall theme or purpose of your page to search engines. The H1 tag represents the most important heading on your page, e.g., the title of the page or blog post.
 
<h2> Headings Status:
  • Check if any H2 headings are used in your page. H2 headings are HTML tags that help clarify the overall theme or purpose of your page to search engines. The H2 tag represents the second most important headings on your page, e.g., the subheadings.
 
Robots.txt Test:
  • Check if your website is using a robots.txt file. When search engine robots crawl a website, they typically first access a site’s robots.txt file. Robots.txt tells Google-bot and other crawlers what is and is not allowed to be crawled on your site. Read More

Sitemap Test:
  • Check if the website has a sitemap. A sitemap is important as it lists all the web pages of the site and lets search engine crawlers crawl the website more intelligently. A sitemap also provides valuable metadata for each webpage.
 
SEO Friendly URL Test:
  • Check if your webpage URLs are SEO friendly. In order for links to be SEO friendly, they should contain keywords relevant to the page’s topic, and contain no spaces, underscores, or other special characters. You should avoid the use of parameters when possible, as they make URLs less inviting for users to click or share. Google’s suggestions for URL structure specify using hyphens or dashes (-) rather than underscores (_). Unlike underscores, Google treats hyphens as separators between words in a URL.
 
Image Alt Test:
  • Check if images on your webpage are using alt attributes. If an image cannot be displayed (e.g., due to broken image source, slow internet connection, etc.), the alt attribute provides alternative information. Using relevant keywords and text in the alt attribute can help both users and search engines better interpret the subject of an image.

Inline CSS Test:
  • Check your webpage HTML tags for inline CSS properties. Inline CSS properties are added by using the style attribute within specific HTML tags. They unnecessarily increase page size and can be moved to an external CSS stylesheet. Removing inline CSS properties can improve page loading time and make site maintenance easier.

Favicon Test:
  • Check if your site is using and correctly implementing a favicon. Favicons are small icons that appear in your browser’s URL navigation bar. They are also saved next to your URL’s title when your page is bookmarked. This helps brand your site and make it easy for users to navigate to your site among a list of bookmarks.

JS Error Checker:
  • Check your page for JavaScript errors. These errors may prevent users from properly viewing your pages and impact their user experience. Sites with poor user experience tend to rank worse in search engine results.

CSS/JS Minification Test:
  • Checks if any external JavaScript or CSS files used in your page are minified. Minified files reduce page size and overall load time.
 
Directory Browsing Test:
  • Check if your server allows directory browsing. If directory browsing is disabled, visitors will not be able to browse your directory by accessing the directory directly (if there is no index.html file). This will protect your files from being exposed to the public. Apache web server allows directory browsing by default. Disabling directory browsing is generally a good idea from a security standpoint.
 
Plaintext Emails Test:
  • Check your webpage for plaintext email addresses. Any e-mail address posted in public is likely to be automatically collected by computer software used by bulk emailers (a process known as e-mail address harvesting). A spam harvester can read through the pages in your site and extract plaintext email addresses which are then added to bulk marketing databases (resulting in more inbox spam). There are several methods for email obfuscation.
Solution: In the HTML, write an anchor tag with a placeholder href, and in a JavaScript file replace the href attribute with the real email address.
 
<a href="mail" id="linkMail"><img src="~/images/mailtoicon.png" alt="Mailto" /></a>
 
$("#linkMail").attr("href", "mailto:info@binaryrepublik.com");
 
 
Micro-Data Schema Test:
  • Check if your website uses HTML Micro-Data specifications (or structured data markup). Search engines use micro-data to better understand the content of your site and create rich snippets in search results (which helps in increasing click-through rate to your site). Read More

If you have any questions you can reach out to our SharePoint Consulting team here.

Deep Dive into Data Visualization Techniques

What is Data Visualization?
Data visualization is the representation of data in visual, graphical, or tabular form, helping us understand its significance.

Data is time variant and as time passes, data from different sources is collected and processed/analyzed. And, this processed/analyzed data enables decision makers at different levels to gain better visibility on various business aspects such as market trends, organization’s revenue, profit percentage over past years etc.

It is common human nature that people concentrate more on data represented in charts and graphs than on long pages of text or bundles of papers.
 
Why Data Visualization is Important?
Visualization helps people analyze data in a pictorial or graphical manner. Even when data volumes are very large, patterns and designs can be spotted quickly. Visualization conveys information easily and makes it easy to understand.

For example, it is difficult to extract information from a plain spreadsheet, and doing so is a time-consuming process. Data visualization presents data in a way that makes it easy for the user to interpret and analyze.

The main reason behind data visualization is that it reveals past and current trends in the data. In this way, it becomes easy for decision makers to make decisions with the help of visualized historical data.

Selecting Solution for Visualization
As we all know, if an organization is able to produce and visualize data effectively and correctly, and thereby deliver accurate information, that is the key to doing successful business and enables decision makers to make accurate and well-informed decisions.

There are three questions that should be answered before selecting visualization solution:
    1.  What do we want to do with our data?
    2.  Which devices are users consuming the data on?
    3.  Where is our data located?
 
Microsoft Technologies Supporting Tool for Visualization
There are several tools provided by Microsoft that can help visualize data correctly. Each of these tools has its own advantages and is targeted at a specific user skill set.

Selection of visualization tool also depends on the organization requirements and the targeted user to whom data will be delivered.

    -  Excel Services
    -  PowerPivot
    -  SSRS
    -  Power View
    -  Power BI
    -  Performance Point
    -  Visio Services

How to Select Right Tool?
There are several use cases that can help an organization in selecting a tool.

Use Case 1: The user is an Excel pro, and the organization has a lot of data, uses SharePoint On-Premise, and needs to provide and share information with many users on the intranet.

Solution is Power Pivot:
    -  It is part of the Excel family (an Add-in).
    -  We can build a Pivot table from multiple tables (multiple sources).
    -  We can build relationships between these tables in a GUI.
    -  It enables the ability to quickly process millions of records.

Use Case 2: The organization has SharePoint On-Premises and wants users to do data analysis and discovery on the intranet on their own.

The solution is Power View:
    -  Power View can be used to explore, analyze, sort, and filter data.
    -  It can also be used to discover relationships and spot trends.
    -  It provides interactive data visualizations.

The use cases above are just examples; ultimately, the choice of tool is based on business requirements and the needs of users.

Performance Point vs Power Pivot vs Power View vs Excel vs SSRS
1.  Performance Point:
With Performance Point, we have the advantage of advanced drill-down on complex cubes.

It's better to use Performance Point if our report is based on certain measures and indicators using Key Performance Indicators (KPIs).

Performance Point reports are supported on the iPad: just hold a finger on any chart and the report will drill down easily.

2.  Power View:
Power View reports run on Silverlight, which can be considered a disadvantage due to its incompatibility with the iPad.

Power View enables self-service report creation for end users: a user can create a report based on a Tabular Model cube, and Power View will manage the connection to the model automatically.

If you merely want a grid of information from the source (cube), Power View is the only one that will do it, whereas Excel Services provides views into cubes that must be consumed through a Pivot Table.

3.  Excel Services:
Excel Services provides many rich features to visualize data using Pivot Tables, charts, slicers, etc.

The charting engine of Excel Services is the best of them all. It gives the most options for formatting, chart types, 2D/3D charts, data labels, and colors.

4. Power Pivot:
In Power Pivot, we can pull up to 4 GB of data into an Excel sheet by creating a data model, and then use that model to create reports.

Basically, this allows users to view data in disconnected mode; in other words, users can carry the data with them.

5. Reporting Services (SSRS):
If we need a report to be printed on paper, we may want tight control over how the report looks, and this is where SSRS truly shines: it can produce pixel-perfect reports and control things like page margins and widths.

If we have reports that need to be emailed on a schedule, this can be done using SSRS. This is helpful when a daily or weekly report must be sent via email.

With SSRS, we can export reports to DOCX, XLSX, PDF, and MHTML formats.

If you have any questions you can reach out to our SharePoint Consulting team here.

January 4, 2017

Mail Address not getting synchronized in SharePoint 2013

People might have stumbled upon the issue of email addresses not getting synchronized in SharePoint 2013 and wondered why that is and how to fix it. Here, I'll walk through the fix in detail.
 
Cause: This issue occurs because the User Profile Service application is trying to sync Work Email with the AD attribute "proxyAddresses" instead of the "mail" attribute.

First of all, make sure User Profile Service and User Profile Synchronization Service are started.

Now, follow the steps below:
  • Go to the User Profile Service Application:
    • Manage Service Applications --> User Profile Service Application.
  • Click on "Manage User Properties" under the People section.
  • In the Manage User Properties page, navigate to the "Work email" property under the Contact Information section and click Edit.
  • In the Edit User Profile Property page, navigate to the Property Mapping for Synchronization section and remove the proxyAddresses mapping.
 
  • Now, go to the Add New Mapping section, select "mail" from the Attribute drop-down, and click Add.
  • You can now see the "mail" attribute in the Property Mapping for Synchronization section.
  • In the Edit User Profile Property page, click OK to apply the changes.
Start a Full Synchronization and wait for the Profile Synchronization Status to become "Idle". Now, you can search for any profile on the Manage User Profiles page and you will see the mail addresses filled in for the users.


If you have any questions you can reach out to our SharePoint Consulting team here.

How to create a Custom Plugin in Redmine

Redmine is a free and open-source, flexible, web-based project management and issue tracking tool written in the Ruby on Rails framework. It is cross-platform and cross-database. We can create our own custom plugin and deploy it in Redmine. Here are the steps:

Step 1:
Open the command prompt for Redmine (Note: Don't use the normal Windows command prompt.)
To open the command prompt:
Start -> All Programs -> Bitnami Redmine Stack -> Use Bitnami Redmine Stack -> Run as Administrator

Then apply below command:
 set RAILS_ENV=production  

This command sets the Rails environment to production. It is the initial step for creating a plugin; DO NOT forget to execute it.

Step 2:
To create a custom plugin in Ruby, you have to use the command prompt. First, create a plugin folder. Let's say I want to create a plugin called "Project Estimation Tracker".

For that I've used below code:
 ruby apps/redmine/htdocs/script/rails g redmine_plugin ProjectEstimationTracker  

Here,
g = generate
ProjectEstimationTracker = the name of your plugin.
apps/redmine/htdocs/script/rails = the rails script that must be executed to create a plugin.


Step 3:
Now, we have to add our custom page to the plugin. For that, we need to create a controller first.

 ruby apps/redmine/htdocs/script/rails g redmine_plugin_controller ProjectEstimationTracker projecteffortstracking index  

Here,

ProjectEstimationTracker = the name of your plugin.
projecteffortstracking = the name of the controller
index = the action

As a result of the above script, a default view file will be created automatically in the views folder.
In that view file, we can apply our custom HTML code.
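As a minimal sketch of what such a view could contain: the markup and the @efforts data below are illustrative assumptions, not the generator's output. A Rails view is an ERB template (HTML with embedded Ruby), and we can render one with Ruby's standard ERB library even outside Redmine:

```ruby
require 'erb'

# A simplified stand-in for the plugin's default view file
# (app/views/projecteffortstracking/index.html.erb). The table markup
# and the @efforts variable are assumptions made for illustration.
template = <<~ERB
  <h2>Project Estimation Tracker</h2>
  <table class="list">
    <% @efforts.each do |row| %>
      <tr><td><%= row[:task] %></td><td><%= row[:hours] %> h</td></tr>
    <% end %>
  </table>
ERB

# In Rails the controller action would set @efforts; here we set it
# directly so the template can be rendered standalone.
@efforts = [{ task: 'Design', hours: 8 }, { task: 'Development', hours: 24 }]
html = ERB.new(template).result(binding)
puts html
```

Inside Redmine, the controller's index action sets the instance variables and Rails renders the template automatically; the snippet above only shows how the ERB tags expand.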


Step 4:
The most important step in creating a plugin is configuring a route. To configure a route, go to your plugin folder, then to its config folder, which contains the routes.rb file.
Example: For my plugin, the routes.rb file path is:
"C:\Bitnami\redmine-2.5.1-1\apps\redmine\htdocs\plugins\project_efforts_tracking\config\routes.rb"

Edit the routes.rb file.
 RedmineApp::Application.routes.draw do
   get 'ProjectEffort', :to => 'projecteffortstracking#index'
 end

Here,
get = the HTTP verb the route responds to.
'ProjectEffort' = the URL segment; using the plugin name is recommended, but you can use any other name you would like to show in the URL.
Example: Here the URL will be: http://localhost:90/redmine/ProjectEffort
:to => 'projecteffortstracking#index' = maps the route to the controller ('projecteffortstracking') and its action ('index').

Step 5:
The final step is to change the configuration settings. Go to your plugin folder, which contains the init.rb file.
Example: project_efforts_tracking → init.rb

Then apply the following code:
 Redmine::Plugin.register :project_efforts_tracking do  
  name 'Project Efforts Tracking plugin'  
  author 'Author name'  
  description 'This is a plugin for Redmine'  
  version '0.0.1'  
  url 'http://example.com/path/to/plugin'  
  author_url 'http://example.com/about'  
  menu :admin_menu, :projecteffortstracking, { :controller => 'projecteffortstracking', :action => 'index' }, :caption => 'Project Estimation Tracker'  
 end  

Here, we can specify where we would like to show the plugin, e.g. in the Admin menu or the Top menu.
For the different types of menus, visit: Redmine Menus
Now, your plugin is ready to use. Restart your server using the Bitnami Redmine Stack manager.
 
If you have any questions you can reach out to our SharePoint Consulting team here.