Author Archives: Senthamil

About Senthamil

I am a Microsoft MVP in Office Development

Group By SPListItem in SharePoint ListItemCollection

A CAML query cannot group the resulting item collection directly. A workaround is to retrieve the items first and then group them with a LINQ query:

try
{
    using (SPSite oSiteCollection = new SPSite(webUrl))
    {
        using (SPWeb oWebsite = oSiteCollection.OpenWeb())
        {
            SPList listName = oWebsite.Lists.TryGetList("MCS");
            if (listName != null)
            {
                SPView myView = listName.Views["GroupBy"];
                if (myView != null)
                {
                    SPQuery qry = new SPQuery(myView);
                    SPListItemCollection myItemsList = listName.GetItems(qry);

                    // Group the retrieved items by the "Group Number" column
                    var results = from SPListItem item in myItemsList
                                  group item by item["Group Number"] into grp
                                  select new
                                  {
                                      GroupID = grp.Key.ToString(),
                                      ListItemsData = grp
                                  };

                    foreach (var listData in results)
                    {
                        MessageBox.Show(listData.GroupID);
                        foreach (var item in listData.ListItemsData)
                        {
                            MessageBox.Show(item["XmlText"].ToString());
                        }
                    }
                }
            }
        }
    }
}
catch (Exception ex)
{
    MessageBox.Show(ex.Message);
}
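The same group-then-enumerate pattern can be sketched outside SharePoint. Below is a small Python illustration (not SharePoint code) where plain dictionaries stand in for the SPListItem objects; the rows and values are made up, but the field names match the C# sample above:

```python
from itertools import groupby

# Hypothetical rows standing in for SPListItem objects from the "MCS" list
items = [
    {"Group Number": "G1", "XmlText": "<a/>"},
    {"Group Number": "G2", "XmlText": "<b/>"},
    {"Group Number": "G1", "XmlText": "<c/>"},
]

# itertools.groupby only groups adjacent elements, so sort by the key first
items.sort(key=lambda row: row["Group Number"])

grouped = {
    key: [row["XmlText"] for row in rows]
    for key, rows in groupby(items, key=lambda row: row["Group Number"])
}

print(grouped)  # {'G1': ['<a/>', '<c/>'], 'G2': ['<b/>']}
```

The LINQ `group ... by ... into grp` clause does the sort-and-partition step internally; with `itertools.groupby` the sort has to be explicit.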

jQuery Accordion based Announcement Web Part

A simple announcement web part based on the jQuery UI accordion. The theme can be changed via http://jqueryui.com/themeroller/: download a theme package from that site and paste the CSS into the layouts folder. Make sure the list you connect to the web part has the same column names as in the picture below.

image

The web part takes the below properties

image

Sample screenshot below

image

image

The WSP and the source are available at http://sdrv.ms/13OhOjj

Cache cluster is down, restart the cache cluster and Retry Error–SharePoint 2013

The Windows event viewer might show many critical errors from the Distributed Cache service. The service retries every 5 minutes and logs a critical error in the event viewer each time. The error message may be "Cache cluster is down, restart the cache cluster and Retry". The root cause can vary: the Distributed Cache service is quite fragile and needs sufficient memory on the server side. Before starting the Distributed Cache service, please read the article http://technet.microsoft.com/library/jj219572(office.15).aspx. It explains that certain firewall ports must be enabled on the server, especially on the application server where the service runs. If the service runs on multiple servers, start the first server, which must have the incoming ports enabled.

I followed the procedure below. It solved the issue for me, though I am not sure it is the best solution.

  1. Remove the Distributed Cache service instance using the PowerShell commands:

Stop-SPDistributedCacheServiceInstance -Graceful

Remove-SPDistributedCacheServiceInstance

  2. Enable the incoming firewall ports 22233, 22234, 22235 and 22236 on the server.

  3. Start the Remote Registry service from Windows Services; the Distributed Cache service needs it to be running. If you plan to run the service on multiple servers, make sure it is running on all of them.

  4. Restart the AppFabric Caching Service from Windows Services.

  5. Add the Distributed Cache service instance using the PowerShell command:

Add-SPDistributedCacheServiceInstance

  6. The service now appears under Central Administration -> System Settings -> Manage Services on Server. Start the Distributed Cache service there.

Install-SPRSService is not Recognized Error–SharePoint 2013 SQL 2012 Reporting Services

Sometimes, even after you install SQL Reporting Services in SharePoint mode on the application server, you might get the error below:

Install-SPRSService : The term ‘Install-SPRSService’ is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.

The reason might be that the wrong version of SQL 2012 Reporting Services for SharePoint is installed; SharePoint 2013 requires SQL Server 2012 SP1. If you downloaded the SQL 2012 SP1 installer in November 2012 or earlier, there was an issue on the Microsoft site and SP1 would not actually be installed. Download the latest SQL 2012 SP1 installer ISO and install Reporting Services in SharePoint mode.

http://support.microsoft.com/kb/2783963

Installing SP1 from the latest ISO solves the issue.

Central Admin Warning – InfoPath Forms Services forms cannot be filled out in a Web browser because no State Service connection is configured

After I completed my SharePoint 2013 setup, the warning above appeared in Central Administration. The fix is to run the PowerShell commands below:

$serviceApp = New-SPStateServiceApplication -Name "State Service"
New-SPStateServiceDatabase -Name "StateService DB" -ServiceApplication $serviceApp
New-SPStateServiceApplicationProxy -Name "State Service" -ServiceApplication $serviceApp -DefaultProxyGroup

Move Search Components To Different Server in SharePoint 2013

SharePoint 2013 Search components are very scalable: you can now move components off the server on which they run. In a SharePoint farm, the application server usually hosts all services, including the Search service application. The Search service application consists of several components:

  • Admin
  • Crawler
  • Content Processing
  • Analytics processing
  • Query Processing
  • Index

Moving the components to another server is not possible from Central Administration; the Search topology pages offer no option to move them. It has to be done with PowerShell commands.

Farm before movement

image

Farm After Movement

image

Below is the PowerShell script for the move.

$ssa = Get-SPEnterpriseSearchServiceApplication
$active = Get-SPEnterpriseSearchTopology -SearchApplication $ssa -Active
$clone = New-SPEnterpriseSearchTopology -SearchApplication $ssa -Clone -SearchTopology $active

$Srv2 = Get-SPEnterpriseSearchServiceInstance -Identity "Server Name Here"
New-SPEnterpriseSearchQueryProcessingComponent -SearchTopology $clone -SearchServiceInstance $Srv2
New-SPEnterpriseSearchAnalyticsProcessingComponent -SearchTopology $clone -SearchServiceInstance $Srv2
New-SPEnterpriseSearchContentProcessingComponent -SearchTopology $clone -SearchServiceInstance $Srv2
New-SPEnterpriseSearchCrawlComponent -SearchTopology $clone -SearchServiceInstance $Srv2

Start-SPEnterpriseSearchServiceInstance -Identity $Srv2
Set-SPEnterpriseSearchTopology -Identity $clone
—————-
Clone again to remove the original components:

$ssa = Get-SPEnterpriseSearchServiceApplication
$active = Get-SPEnterpriseSearchTopology -SearchApplication $ssa -Active
$clone = New-SPEnterpriseSearchTopology -SearchApplication $ssa -Clone -SearchTopology $active

# List the components to get the component IDs for the commands below
Get-SPEnterpriseSearchComponent -SearchTopology $clone

Remove-SPEnterpriseSearchComponent -Identity COMPID -SearchTopology $clone -Confirm:$false
Remove-SPEnterpriseSearchComponent -Identity COMPID -SearchTopology $clone -Confirm:$false
Remove-SPEnterpriseSearchComponent -Identity COMPID -SearchTopology $clone -Confirm:$false
Remove-SPEnterpriseSearchComponent -Identity COMPID -SearchTopology $clone -Confirm:$false

Set-SPEnterpriseSearchTopology -Identity $clone

—–

Replace "Server Name Here" with the name of the server you want to move the components to.

COMPID is the component ID of the old search component to be removed. You can list all components running on the server with the command:

Get-SPEnterpriseSearchComponent -SearchTopology $clone

I followed the post from this blog http://stevemannspath.blogspot.ch/2013/03/sharepoint-2013-search-moving-query.html

How to find Concurrent Users for SharePoint and ASP.NET using IIS Logs

This post is about finding the number of concurrent users, which helps with capacity planning for SharePoint. Below are the main metrics required for SharePoint 2010/2013 capacity planning:

  • Average daily RPS
  • Average RPS at peak time
  • Total number of unique users per day
  • Average daily concurrent users
  • Peak concurrent users at peak time
  • Total number of requests per day

One of the main attributes for the planning is concurrent users. Below are the steps required to find them:

  1. Enable IIS logging on the server
  2. Collect the logs for a particular period
  3. Install the Log Parser tool
  4. Analyse the log files using Log Parser
  5. Plot a graph or use the data in any form
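The core of the analysis is simple: quantize each request's timestamp into a fixed window and count distinct client IPs per window. Below is a small Python sketch of that idea, a simplified stand-in for the Log Parser queries used later; the timestamps and IP addresses are made up:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical (time, client-ip) pairs, as they would appear in an IIS log
requests = [
    ("08:00:10", "10.0.0.1"),
    ("08:02:45", "10.0.0.2"),
    ("08:03:30", "10.0.0.1"),
    ("08:07:05", "10.0.0.3"),
]

PERIOD = 300  # quantization window in seconds (5 minutes)

windows = defaultdict(set)
for time_str, ip in requests:
    t = datetime.strptime(time_str, "%H:%M:%S")
    secs = t.hour * 3600 + t.minute * 60 + t.second
    window = secs - secs % PERIOD   # start of the 5-minute window
    windows[window].add(ip)         # distinct users seen in that window

# Concurrent users per window = number of distinct client IPs
concurrent = {w: len(ips) for w, ips in sorted(windows.items())}
print(concurrent)  # {28800: 2, 29100: 1}
```

The Log Parser SQL in step 4 computes the same seconds-since-midnight value and the batch file then applies QUANTIZE and a distinct count, just at IIS-log scale.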

You can download load.txt and the batch file from SkyDrive.

Step 1:

Before collecting the IIS log files, make sure the fields below are enabled in IIS. These fields are the key to the analysis.

Field               Column name
Date                date
Time                time
Client IP Address   c-ip
User Name           cs-username
Method              cs-method
URI Stem            cs-uri-stem
Protocol Status     sc-status
Protocol Substatus  sc-substatus
Bytes Sent          sc-bytes
Bytes Received      cs-bytes
Time Taken          time-taken
User Agent          cs(User-Agent)

IIS Log

Step 2:

After enabling the necessary fields, let the application be used for a few days, then collect the IIS logs from c:\inetpub\logs\logfiles\. The log files are scattered across multiple directories; it is usually better to assign a dedicated log directory to the IIS site to make gathering them easier. The files have the .log extension. The analysis is usually done off the server, so copy all the files to a local desktop.

Step 3:

Install the Log Parser tool on the laptop or desktop where the log files were collected. It can be downloaded from the location below:

http://www.microsoft.com/en-us/download/details.aspx?id=24659

After installing Log Parser, add the logparser.exe path to the PATH environment variable; this makes it easy to run logparser from any directory in a command prompt.

Step 4:

To get the concurrent users follow the below steps

Copy the text below into a file named load.txt:

select EXTRACT_FILENAME(LogFilename), LogRow,

date, time, cs-method, cs-uri-stem, cs-username, c-ip, cs(User-Agent), cs-host, sc-status, sc-substatus, sc-bytes, cs-bytes, time-taken,

add(
   add(
      mul(3600,to_int(to_string(to_localtime(to_timestamp(date,time)),'hh'))),
      mul(60,to_int(to_string(to_localtime(to_timestamp(date,time)),'mm')))
   ),
   to_int(to_string(to_localtime(to_timestamp(date,time)),'ss'))
) as secs,

to_int(to_string(to_localtime(to_timestamp(date,time)),'yy')) as yy,
to_int(to_string(to_localtime(to_timestamp(date,time)),'MM')) as mo,
to_int(to_string(to_localtime(to_timestamp(date,time)),'dd')) as dd,

to_int(to_string(to_localtime(to_timestamp(date,time)),'hh')) as hh,
to_int(to_string(to_localtime(to_timestamp(date,time)),'mm')) as mi,
to_int(to_string(to_localtime(to_timestamp(date,time)),'ss')) as ss,

to_lowercase(EXTRACT_PATH(cs-uri-stem)) as fpath,
to_lowercase(EXTRACT_FILENAME(cs-uri-stem)) as fname,
to_lowercase(EXTRACT_EXTENSION(cs-uri-stem)) as fext

from *.log

where sc-status <> 401

After copying the text into load.txt, run Log Parser. Assume all the IIS log files are in c:\iislogs\. Open a command prompt in admin mode and navigate to the log folder, e.g. cd c:\iislogs.

In my case the log files are in d:\log files\april 2013

image

image

Run the command below in the command prompt to create bigo.csv in the log folder, which can later be used for analysis.

logparser -i:IISW3C file:load.txt -o:csv -q >bigo.csv

The command above creates bigo.csv in the folder. I created a batch file based on the blog post at http://blogs.msdn.com/b/markarend/archive/2012/02/24/measuring-concurrent-site-users.aspx

Create a batch file named ConcurrentUser.bat and add the script below to it:

SET inputCSV=%1
if '%inputCSV%'=='' GOTO USAGE

REM outputCSV has no extension, it's added later
SET outputCSV=UserConcurrency

SET concurrencyPeriod=%2
if '%concurrencyPeriod%'=='' GOTO USAGE

SET concurrencyField=%3
if '%concurrencyField%'=='' SET concurrencyField=c-ip

REM Set a filter to match requests that should be excluded
REM %%Service%% matches our service accounts (like search), exclude them
REM ...if you don't want a filter, use SET filter=0 IS NULL
SET filter=(cs-username LIKE '%%Service%%')

echo.
echo Counting Concurrent Users
echo inputCSV         : %inputCSV%
echo outputCSV        : %outputCSV%.csv
echo concurrencyField : %concurrencyField%
echo concurrencyPeriod: %concurrencyPeriod% seconds

echo.
echo First stage, quantizing to %concurrencyPeriod% seconds...
logparser -i:CSV -o:CSV "SELECT DISTINCT %concurrencyField%, date, QUANTIZE(TO_TIMESTAMP(time,'hx:mx:sx'), %concurrencyPeriod%) AS Period, COUNT(*) as Requests INTO temp1-%outputCSV%.csv FROM %inputCSV% WHERE %concurrencyField% IS NOT NULL AND NOT %filter% GROUP BY date, Period, %concurrencyField%"

echo.
echo Second stage, grouping...
logparser -i:CSV -o:CSV "SELECT date, to_string(to_timestamp(Period,'hx:mx:sx'),'hh') as Hour, Period, count(%concurrencyField%) as UserCount, sum(Requests) as RequestCount INTO temp2-%outputCSV%.csv FROM temp1-%outputCSV%.csv GROUP BY date, Period"
logparser -i:CSV -o:CSV "SELECT Hour, avg(UserCount) as Concurrent-Users(q%concurrencyPeriod%), sum(RequestCount) as Total-Requests(q%concurrencyPeriod%) INTO %outputCSV%-%concurrencyPeriod%.csv FROM temp2-%outputCSV%.csv GROUP BY Hour ORDER BY Hour"

GOTO DONE

:USAGE

echo.
echo # Usage:
echo #
echo # ConcurrentUser inputCSV seconds [fieldname]
echo #
echo # inputCSV : csv file (or *.csv) of log entries with fields: date, time, c-ip or [fieldname], other
echo # seconds  : concurrency quantization level. Typical values 300 and 600 seconds
echo # fieldname: field to evaluate for concurrency. Typical values c-ip (default) and cs-username
echo #
echo # Example  : ConcurrentUser BigO.csv 300
echo.

:DONE

Now run the commands below in the command prompt to get the concurrent users for several quantization periods:

ConcurrentUser BigO.csv 300

ConcurrentUser BigO.csv 1200

ConcurrentUser BigO.csv 3600

Using the results from the commands above, you can plot a graph in Excel.

Graph

Certification Failed, Charm Settings Privacy Policy not Added

The first time I uploaded the package for certification to the Windows Store, it failed. The failure details stated that a Privacy Policy had not been added to the Settings charm in the app. I thought this meant a URL entry on the app description page in the store release, so I added the URL to the App Description section and saved it again. Certification failed again with the same message. On further analysis, I found that the Privacy Policy command has to be added to the Settings charm manually in C# code.

In App.xaml.cs, add the following lines of code.

Declaration

using Windows.UI.ApplicationSettings;
using Windows.UI.Popups;

In the OnLaunched Event add the below code.

protected override async void OnLaunched(LaunchActivatedEventArgs args)
{
    // ... existing launch code above; add the lines below at the end
    try
    {
        SettingsPane.GetForCurrentView().CommandsRequested += App_CommandsRequested;
    }
    catch (Exception)
    {
    }
}
Further down, add the code below:
void App_CommandsRequested(SettingsPane sender, SettingsPaneCommandsRequestedEventArgs args)
{
    args.Request.ApplicationCommands.Add(new SettingsCommand("privacypolicy", "Privacy Policy", OpenPrivacy));
}

private async void OpenPrivacy(IUICommand cmd)
{
    // Replace with your own privacy policy URL
    Uri privacyUrl = new Uri("http://www.bing.com/privacy");
    await Windows.System.Launcher.LaunchUriAsync(privacyUrl);
}

That's it. Now when you open the app and go to the Settings charm, you can see the Privacy Policy option. Selecting it opens IE and navigates to the URL.