Category Archives: SharePoint

CSOM, PowerShell, SharePoint Search & SPFx

jQuery Accordion based Announcement Web Part

A simple announcement web part based on the jQuery accordion. The theme can be changed from the site below. To change the theme, download the theme package from the link above and paste the CSS into the Layouts folder. Make sure the list you connect to the web part has the same column names as in the picture below.


The web part takes the properties below


Sample screenshot below



The WSP and the source are in the folder

Cache cluster is down, restart the cache cluster and Retry Error–SharePoint 2013

The Windows event viewer may show many critical errors from the Distributed Cache service. The service retries every 5 minutes and throws a critical error into the event log each time. The error message reads "Cache cluster is down, restart the cache cluster and Retry". This error can have various root causes: the Distributed Cache service is fragile and needs sufficient memory on the server side. Before starting the Distributed Cache service, please read the article. It explains that certain firewall ports must be enabled on the server, especially on the application server where the service runs. If the service runs on multiple servers, start the first server, which must have the incoming ports enabled.

I followed the procedure below; it solved the problem for me, though I am not sure it is the best solution.

Stop and remove the Distributed Cache service instance using the PowerShell commands

Stop-SPDistributedCacheServiceInstance -Graceful
Remove-SPDistributedCacheServiceInstance


Enable inbound firewall rules on the server for ports 22233, 22234, 22235 and 22236.
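As a sketch, the inbound rule can be created from an elevated prompt; the rule name below is arbitrary, and on Windows Server 2012 or later New-NetFirewallRule can be used instead of netsh:

```powershell
# Open the AppFabric Distributed Cache ports (22233-22236) for inbound TCP traffic.
# "SPDistributedCache" is just a label - pick any rule name you like.
netsh advfirewall firewall add rule name="SPDistributedCache" dir=in action=allow protocol=TCP localport=22233-22236
```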

From Windows Services, start the Remote Registry service; the Distributed Cache service requires it to be running. If you plan to run the cache on multiple servers, make sure the service is running on all of them.

Restart the AppFabric Caching Service from Windows Services.

Add the Distributed Cache service instance back using the PowerShell command

Add-SPDistributedCacheServiceInstance
Through Central Administration -> System Settings -> Manage Services on Server, the service should now be visible. Start the Distributed Cache service.
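To verify the cache cluster is healthy afterwards, the AppFabric administration cmdlets can be used; this is a sketch to run in an elevated PowerShell window on a cache host:

```powershell
# Load the AppFabric cache administration module and connect to this cluster
Import-Module DistributedCacheAdministration
Use-CacheCluster

# Each cache host should report Service Status "UP"
Get-CacheHost
```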

Install-SPRSService is not Recognized Error–SharePoint 2013 SQL 2012 Reporting Services

Sometimes, even after you install SQL Server Reporting Services in SharePoint mode on the application server, you might get the error below:

Install-SPRSService : The term ‘Install-SPRSService’ is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.

The reason might be that the wrong version of SQL Server 2012 Reporting Services for SharePoint is installed. SharePoint 2013 requires SQL Server 2012 SP1. If you downloaded the SQL 2012 SP1 installer in November 2012 or earlier, there was an issue on the Microsoft site and SP1 would not actually be installed. So please download the latest SQL 2012 SP1 installer ISO and install Reporting Services in SharePoint mode.

Installing the SP1 with the latest ISO will solve the issue.
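Once the correct SP1 build is in place, the cmdlets should resolve and the service can be registered and started; a sketch, run from the SharePoint 2013 Management Shell:

```powershell
# Register the Reporting Services service and its proxy with SharePoint
Install-SPRSService
Install-SPRSServiceProxy

# Start the Reporting Services service instance on this server
Get-SPServiceInstance -All |
    Where-Object { $_.TypeName -like "SQL Server Reporting*" } |
    Start-SPServiceInstance
```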

Central Admin Warning – InfoPath Forms Services forms cannot be filled out in a Web browser because no State Service connection is configured

After I completed my SharePoint 2013 setup, the warning mentioned above appeared in Central Administration. The solution is to run the PowerShell commands below:

$serviceApp = New-SPStateServiceApplication -Name "State Service"
New-SPStateServiceDatabase -Name "StateService DB" -ServiceApplication $serviceApp
New-SPStateServiceApplicationProxy -Name "State Service" -ServiceApplication $serviceApp -DefaultProxyGroup

Move Search Components To Different Server in SharePoint 2013

SharePoint 2013 Search components are very scalable. You can now move search components away from the server they are running on. In a SharePoint farm, the application server usually hosts all the services, including the Search service application. The Search application consists of several components:

  • Admin
  • Crawler
  • Content Processing
  • Analytics processing
  • Query Processing
  • Index

Moving the components to another server is not possible from Central Administration; the search topology page offers no move option. It has to be done with PowerShell.

Farm before movement


Farm After Movement


Below is the PS script for the activity.

$ssa = Get-SPEnterpriseSearchServiceApplication
$active = Get-SPEnterpriseSearchTopology -SearchApplication $ssa -Active
$clone = New-SPEnterpriseSearchTopology -SearchApplication $ssa -Clone -SearchTopology $active

$Srv2 = Get-SPEnterpriseSearchServiceInstance -Identity "Server Name Here"
Start-SPEnterpriseSearchServiceInstance -Identity $Srv2

New-SPEnterpriseSearchQueryProcessingComponent -SearchTopology $clone -SearchServiceInstance $Srv2
New-SPEnterpriseSearchAnalyticsProcessingComponent -SearchTopology $clone -SearchServiceInstance $Srv2
New-SPEnterpriseSearchContentProcessingComponent -SearchTopology $clone -SearchServiceInstance $Srv2
New-SPEnterpriseSearchCrawlComponent -SearchTopology $clone -SearchServiceInstance $Srv2

Set-SPEnterpriseSearchTopology -Identity $clone
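Activating a topology can take a while; as a sketch, the state of each component can be checked with:

```powershell
# Shows each search component and its current state (Active, Degraded, etc.)
$ssa = Get-SPEnterpriseSearchServiceApplication
Get-SPEnterpriseSearchStatus -SearchApplication $ssa -Text
```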
Clone again to remove the original components:

$ssa = Get-SPEnterpriseSearchServiceApplication
$active = Get-SPEnterpriseSearchTopology -SearchApplication $ssa -Active
$clone = New-SPEnterpriseSearchTopology -SearchApplication $ssa -Clone -SearchTopology $active

# List the components to get the component IDs for the removal commands below
Get-SPEnterpriseSearchComponent -SearchTopology $clone

Remove-SPEnterpriseSearchComponent -Identity COMPID -SearchTopology $clone -Confirm:$false
Remove-SPEnterpriseSearchComponent -Identity COMPID -SearchTopology $clone -Confirm:$false
Remove-SPEnterpriseSearchComponent -Identity COMPID -SearchTopology $clone -Confirm:$false
Remove-SPEnterpriseSearchComponent -Identity COMPID -SearchTopology $clone -Confirm:$false

Set-SPEnterpriseSearchTopology -Identity $clone


Replace "Server Name Here" with the name of the server you want to move the components to.

COMPID is the component ID of the old search component to be removed. You can list all components, with the server each one runs on, using the command

Get-SPEnterpriseSearchComponent -SearchTopology $clone
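As an alternative to pasting the IDs by hand, the old components can be removed in one pipeline. This is a sketch that assumes the old server is named APP01; note it removes every component still on that server, so keep any (such as Index) that you are not moving:

```powershell
# Remove all components in the cloned topology that still live on the old server
$old = "APP01"   # hypothetical old server name - change to yours
Get-SPEnterpriseSearchComponent -SearchTopology $clone |
    Where-Object { $_.ServerName -eq $old } |
    ForEach-Object {
        Remove-SPEnterpriseSearchComponent -Identity $_.ComponentId -SearchTopology $clone -Confirm:$false
    }
```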

I followed the post from this blog

How to find Concurrent Users for SharePoint and ASP.NET using IIS Logs

This post is about finding the number of concurrent users. Concurrent user counts help with capacity planning for SharePoint. Below are the main attributes required for capacity planning in SharePoint 2010/2013:

  • Average daily RPS
  • Average RPS at peak time
  • Total number of unique users per day
  • Average daily concurrent users
  • Peak concurrent users at peak time
  • Total number of requests per day


One of the main attributes for planning is concurrent users. Below are the steps required to find them:

  1. Enable IIS logging on the server
  2. Collect the logs for a particular period
  3. Install the Log Parser tool
  4. Analyse the log files using Log Parser
  5. Finally, plot a graph or use the data in any form you like

You can download the load.txt and the bat file from the SkyDrive folder

Step 1:

Before collecting the IIS log files, make sure the fields below are enabled in IIS logging. These fields are the key to the analysis.


Column name          W3C field
Date                 date
Time                 time
Client IP Address    c-ip
User Name            cs-username
Method               cs-method
URI Stem             cs-uri-stem
Protocol Status      sc-status
Protocol SubStatus   sc-substatus
Bytes Sent           sc-bytes
Bytes Received       cs-bytes
Time Taken           time-taken
User Agent           cs(User-Agent)

Step 2:

After enabling the necessary fields, let the application be used for a few days, then collect the IIS logs from c:\inetpub\logs\logfiles\. The log files will be scattered across multiple directories; it is usually better to set a dedicated log directory on the IIS site to make gathering them easier. The log files have the *.log extension. The analysis is usually done off the server, so collect all the files and copy them to your local desktop.

Step 3:

Install the Log Parser tool on the laptop or desktop where the log files were collected. Log Parser can be downloaded from the location below.

After installing Log Parser, it is better to add the logparser.exe folder to the PATH environment variable; that makes it easy to run logparser from any directory in a command prompt.
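For example, the PATH change can be scripted; this is a sketch, and the install path below is the Log Parser 2.2 default, which may differ on your machine:

```powershell
# Append the Log Parser folder to the machine-wide PATH (run elevated)
$lpPath = "C:\Program Files (x86)\Log Parser 2.2"
[Environment]::SetEnvironmentVariable("Path", $env:Path + ";" + $lpPath, "Machine")
```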

Step 4:

To get the concurrent users follow the below steps

Copy the text below into a file named Load.txt

select EXTRACT_FILENAME(LogFilename),LogRow,

date, time, cs-method, cs-uri-stem, cs-username, c-ip, cs(User-Agent), cs-host, sc-status, sc-substatus, sc-bytes, cs-bytes, time-taken,

add(mul(3600,to_int(to_string(to_localtime(to_timestamp(date,time)),'hh'))),
add(mul(60,to_int(to_string(to_localtime(to_timestamp(date,time)),'mm'))),
to_int(to_string(to_localtime(to_timestamp(date,time)),'ss')))) as secs,

to_int(to_string(to_localtime(to_timestamp(date,time)),'yy')) as yy,
to_int(to_string(to_localtime(to_timestamp(date,time)),'MM')) as mo,
to_int(to_string(to_localtime(to_timestamp(date,time)),'dd')) as dd,

to_int(to_string(to_localtime(to_timestamp(date,time)),'hh')) as hh,
to_int(to_string(to_localtime(to_timestamp(date,time)),'mm')) as mi,
to_int(to_string(to_localtime(to_timestamp(date,time)),'ss')) as ss,

to_lowercase(EXTRACT_PATH(cs-uri-stem)) as fpath,
to_lowercase(EXTRACT_FILENAME(cs-uri-stem)) as fname,
to_lowercase(EXTRACT_EXTENSION(cs-uri-stem)) as fext

from *.log

where sc-status <> 401

After saving the text into load.txt, run Log Parser. Assume all the IIS log files are in c:\iislogs\. Open a command prompt in admin mode and use cd to navigate to the log folder, e.g. cd c:\iislogs.

In my case the log files are in d:\log files\april 2013



Run the command below to create a bigo.csv in the log folder, which is used later for the analysis.

logparser -i:IISW3C file:load.txt -o:csv -q >bigo.csv

The above command will create a bigo.csv in the folder. I have created a bat file based on the blog mentioned here

Create a batch file named ConcurrentUsers.bat and add the script below into it

SET inputCSV=%1
if '%inputCSV%'=='' GOTO USAGE

REM outputCSV has no extension, it's added later
SET outputCSV=UserConcurrency

SET concurrencyPeriod=%2
if '%concurrencyPeriod%'=='' GOTO USAGE

SET concurrencyField=%3
if '%concurrencyField%'=='' SET concurrencyField=c-ip

REM Set a filter to match requests that should be excluded
REM %%Service%% matches our service accounts (like search), exclude them
REM ...if you don't want a filter, use SET filter=0 IS NULL
SET filter=(cs-username LIKE '%%Service%%')

echo Counting Concurrent Users
echo inputCSV         : %inputCSV%
echo outputCSV        : %outputCSV%.csv
echo concurrencyField : %concurrencyField%
echo concurrencyPeriod: %concurrencyPeriod% seconds

echo First stage, quantizing to %concurrencyPeriod% seconds...
logparser -i:CSV -o:CSV "SELECT DISTINCT %concurrencyField%, date, QUANTIZE(TO_TIMESTAMP(time,'hx:mx:sx'), %concurrencyPeriod%) AS Period, COUNT(*) as Requests INTO temp1-%outputCSV%.csv FROM %inputCSV% WHERE %concurrencyField% IS NOT NULL AND NOT %filter% GROUP BY date, Period, %concurrencyField%"

echo Second stage, grouping...
logparser -i:CSV -o:CSV "SELECT date, to_string(to_timestamp(Period,'hx:mx:sx'),'hh') as Hour, Period, count(%concurrencyField%) as UserCount, sum(Requests) as RequestCount INTO temp2-%outputCSV%.csv FROM temp1-%outputCSV%.csv GROUP BY date, Period"
logparser -i:CSV -o:CSV "SELECT Hour, avg(UserCount) as Concurrent-Users(q%concurrencyPeriod%), sum(RequestCount) as Total-Requests(q%concurrencyPeriod%) INTO %outputCSV%-%concurrencyPeriod%.csv FROM temp2-%outputCSV%.csv GROUP BY Hour ORDER BY Hour"

GOTO :EOF

:USAGE
echo # Usage:
echo #
echo # ConcurrentUsers inputCSV seconds [fieldname]
echo #
echo # inputCSV : csv file (or *.csv) of log entries with fields: date, time, c-ip or [fieldname], other
echo # seconds  : concurrency quantization level.  Typical values 300 and 600 seconds
echo # fieldname: field to evaluate for concurrency.  Typical values c-ip (default) and cs-username
echo #
echo # Example  : ConcurrentUsers BigO.csv 300


Now run the commands below in the command prompt to get the concurrent users for multiple quantization periods

ConcurrentUsers BigO.csv 300

ConcurrentUsers BigO.csv 1200

ConcurrentUsers BigO.csv 3600

Using the results from these commands, we can plot a graph in Excel.


Enable Claims Based on existing Web Application with Classic Mode (Access denied Error)

Recently I encountered a situation where I needed to enable claims-based authentication on an existing web application that used classic Windows authentication. There is no way to do this from Central Administration; it can only be achieved with PowerShell.

$WebAppName = "http://yourWebAppUrl"
$wa = Get-SPWebApplication $WebAppName
$wa.UseClaimsAuthentication = $true
$wa.Update()

The code above enables claims-based authentication on the existing web application. But once it is enabled, users may get an Access Denied error when they try to log in, because user accounts are stored in a different format under claims-based authentication.

The commands below must be executed to grant a policy account and migrate all the users from the existing Windows accounts to claims-based accounts.

Warning: Once migrated it will change the user information in all the content databases. This change is permanent.

$account = "yourDomain\yourUser"
$account = (New-SPClaimsPrincipal -Identity $account -IdentityType 1).ToEncodedString()
$wa = Get-SPWebApplication $WebAppName
$zp = $wa.ZonePolicies("Default")
$p = $zp.Add($account,"PSPolicy")
$fc = $wa.PolicyRoles.GetSpecialRole("FullControl")
$p.PolicyRoleBindings.Add($fc)
$wa.Update()
$wa.MigrateUsers($true)
$wa.ProvisionGlobally()


Revert Back from Claims Authentication to Windows.

$WebAppName = "http://yourWebAppUrl"
$wa = Get-SPWebApplication $WebAppName
$wa.UseClaimsAuthentication = $false
$wa.Update()

The code above only reverts the web application; the users are not migrated and still need to be converted back to Windows accounts. But when I tried MigrateUsers($false) I got the error below:


So that method cannot be used to revert the users. Instead I followed the approach given in the blog below.

Basically, strip off the claims prefix before the user name in the site collection and migrate each account back (the object-model equivalent of stsadm -o migrateuser).

Below is the code taken from that blog.

public static void MigrateSiteUsers(string url)
{
    using (SPSite site = new SPSite(url))
    using (SPWeb web = site.RootWeb)
    foreach (SPUser user in web.AllUsers)
    {
        string username = GetClaimBasedUserName(user);
        if (!username.Equals(string.Empty))
        {
            Console.Write("Migrating {0} to {1}...", user.LoginName, username);
            try
            {
                SPFarm.Local.MigrateUserAccount(user.LoginName, username, false);
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.Message);
            }
        }
    }
}

private static string GetClaimBasedUserName(SPUser user)
{
    string username = string.Empty;
    if (user.IsDomainGroup)
    {
        if (user.LoginName.StartsWith("c:0+.w|"))   // claims-encoded Windows group
            username = user.Name;
    }
    else if (user.LoginName.StartsWith("i:0#.w|"))  // claims-encoded Windows user
    {
        username = user.LoginName.Substring(7);
    }
    return username;
}

LINQ Query to SharePoint List using Contains with Array

SPMetal tool folder location

C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\BIN\

Command to create an entity class for the site

SPMetal /web:http://<servername>/site /namespace:<namesp> /code:filename.cs

ex: SPMetal /web:http://senthamil/sites/sample /namespace:Sample /code:Sample.cs

The Sample.cs file will be generated with the list and content type information. To compare against an array of document names in a document library, a LINQ query can be used to filter the documents.

List<string> docNames = new List<string> { "Contract", "Agreement", "Case Study", "Circular" };

List<string> docExt = new List<string> { ".doc", ".docx", ".xls", ".xlsx" };

Sample.SampleDataContext ctx = new Sample.SampleDataContext(SPContext.Current.Web.Url);

// The LINQ query to filter the documents with the specific names and extensions
var result = from docs in ctx.MyDocuments.ScopeToFolder("", true) // all documents, irrespective of folders
             where docNames.Contains(Path.GetFileNameWithoutExtension(docs.Name))
                && docExt.Contains(Path.GetExtension(docs.Name))
             select docs;

The code above filters the documents by name and extension from the MyDocuments document library in the Sample SharePoint site. With a single, easy-to-understand query we are able to filter documents on dynamic criteria.