Best Practices for SharePoint Application Development

Guys, I came across an MSDN document listing best practices for SharePoint-based application development, and I recommend you follow them when you code against the SharePoint object model. The goal is to help ensure that the solutions you develop for your SharePoint environment deliver the intended benefits without exposing your enterprise to unnecessary risk.

This one from Microsoft (click the link) is also very good for improving the performance of ASP.NET apps running under SharePoint.

This is not a final list; these are my own experiences with the MSDN document mentioned above. If you know of more practices, please share them here.

Look here for Reporting Services installation and configuration.

Best Coding Techniques To Improve Performance for SharePoint Applications

Introduction

As more developers write custom code by using the SharePoint Object Model, they encounter common issues that can affect application performance.

The following areas reflect the main issues encountered by developers as they write custom code by using the SharePoint object model:

§  Disposing of SharePoint objects

§  Caching data and objects

§  Writing code that is scalable

 

Disposing of SharePoint Objects

One of the biggest issues that a developer can encounter when writing custom code by using the SharePoint object model is not properly disposing of the SharePoint objects.

In the SharePoint object model, the Microsoft.SharePoint.SPSite and Microsoft.SharePoint.SPWeb objects are created in managed code as a small wrapper (approximately 2 KB in size). This wrapper then creates unmanaged objects, which can average approximately 1–2 MB in size. If your code resembles the following code example, and if you assume that the SPWeb.Webs collection has 10 subsites, a total of 10 items are created, each with an average of 2 MB of memory (for a total of 20 MB).

 

public void GetNavigationInfo()

{

   SPWeb oSPWeb = SPContext.Current.Web;

 

    // .. Get information oSPWeb for navigation ..

 

   foreach(SPWeb oSubWeb in oSPWeb.GetSubWebsForCurrentUser())

   {

      // .. Add subweb information for navigation ..

    }

}

 

Under lighter user loads, the previous scenario might not present an issue; however, as the user load increases you can begin to experience poor performance, user time-outs, unexpected errors, and in some cases, a failure of the SharePoint application or application pool. Using the previous example of 10 objects created each time a user hits the page, you can see how the memory use rises very quickly.

 For example, the following table shows how much memory is allocated as users hit the system within a relatively short amount of time.

Table 1. Best and worst case memory usage as the number of users increases

Users    Best Case    Worst Case

10       100 MB       200 MB

50       500 MB       1000 MB

100      1000 MB      2000 MB

250      2500 MB      5000 MB

 

As the number of users hitting the system increases, this situation worsens. As the memory use increases, the system begins to behave strangely, including performing poorly and possibly failing until the application pool is recycled or an iisreset command is issued.

 

How to Identify the Issue

You can easily identify this issue by asking the following questions:

1. Does your application pool recycle frequently, especially under heavy loads?

This assumes that the application pool is set to recycle when a memory threshold is reached. The memory threshold should be between 800 MB and 1.5 GB (assuming you have 2 GB or more of RAM). Setting the recycle of the application pool to occur closer to 1 GB gives the best results, but you should experiment to see what settings work best for your environment. If the recycle setting is too low, you will experience performance issues because of frequent application pool recycles. If the setting is too high, your system will begin to experience performance problems because of page swapping, memory fragmentation, and other issues.

2. Does your system perform poorly, especially under heavy loads?

As memory usage begins to increase the system must compensate, for example, by paging memory and handling memory fragmentation.

3. Does your system crash, or do users experience unexpected errors such as time-outs or page-not-available errors, especially under heavy loads?

Again, when memory utilization gets high or fragmented, some functions can fail because they cannot allocate memory for other operations. In many cases, the code does not properly handle the “out of memory” exception, which leads to false or misleading errors.

4. Does your system use custom Web Parts or any third-party Web Parts?

Most Web Part developers are not aware that they must dispose of SharePoint objects and why. They assume that garbage collection performs this function automatically, but that is not true in all cases.

If you answer “yes” to number 4, and to one or more of the other questions, there is about a 90 percent chance that your custom code is not disposing of items properly. As you can see from Table 1, you need only one heavily used page that is not properly disposing of items to cause some problems. Following is an example of how to fix the previous GetNavigationInfo function.

 

public void GetNavigationInfo()

{

   SPWeb oSPWeb = SPContext.Current.Web;

 

   foreach(SPWeb oSubWeb in oSPWeb.GetSubWebsForCurrentUser())

   {

      // .. Add subweb information for navigation ..

      oSubWeb.Dispose();

   }

}

 In the foreach loop, new SPWeb objects are created each time they are retrieved from the collection. Most developers assume that the objects are cleaned up when they go out of scope, but this does not happen when you use the SharePoint object model.

You must also be aware of other issues that can cause problems. For example, after calling the RootWeb property on a site, you must dispose of the SPWeb object it creates by calling the RootWeb.Dispose() method.
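For objects you create yourself, the C# using statement is the easiest way to guarantee disposal. A minimal sketch, following the same pattern as the examples in this article (the site URL is a hypothetical placeholder):

```csharp
public void ReadRootWebTitle()
{
   // Hypothetical site URL -- replace with one from your own farm.
   // Both using blocks dispose of their objects (the SPSite and the
   // SPWeb created by OpenWeb) even if an exception is thrown.
   using(SPSite oSite = new SPSite("http://server/sites/demo"))
   using(SPWeb oWeb = oSite.OpenWeb())
   {
      string title = oWeb.Title;
      // .. Use the title for navigation, logging, and so on ..
   }
}
```

Explicit Dispose() calls, as in the loop above, are still needed when the object's lifetime does not fit neatly into a single block.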

Note: 

Do not dispose of any item returned directly from the Microsoft.SharePoint.SPContext.Site or Microsoft.SharePoint.SPContext.Web property. Doing so can cause the SharePoint system to become unstable and can cause application failure.

 Caching Data and Objects

Many developers are starting to use the Microsoft .NET Framework caching objects (for example, System.Web.Caching.Cache) to help make better use of memory and increase overall system performance. But, many objects are not “thread safe” and caching those objects can lead to application crashes and unexpected or unrelated user errors.

Caching SharePoint Objects That Are Not Thread Safe

Developers are trying to increase performance and memory usage by caching SPListItemCollection objects that are returned from queries. In general, this is a good practice but the SPListItemCollection object contains an embedded SPWeb object that is not thread safe and should not be cached. For example, assume the SPListItemCollection object is cached in thread A. Then, as other threads try to read it, the application can fail or behave strangely because the object is not thread safe.

 Not Using Thread Synchronization

Some developers are not aware that they are running in a multi-threaded environment (by default, Internet Information Services is multi-threaded) or how to manage that environment. The following code example shows how some developers are caching Microsoft.SharePoint.SPListItemCollection objects.

 

public void CacheData()

{

   SPListItemCollection oListItems;

 

   oListItems = (SPListItemCollection)Cache["ListItemCacheName"];

   if(oListItems == null)

   {

      oListItems = DoQueryToReturnItems();

      Cache.Add("ListItemCacheName", oListItems, ..);

   }

}

 In the previous code example, the problem is that if the query to get the data takes 10 seconds, you could have many users hitting that page at the same time, all running the same query and trying to update the same cache object at the same time. This can cause performance issues because the same query might be running 10, 50, or 100 times and can cause crashes because multiple threads are trying to update the same object at the same time, especially on multi-process, hyper-threaded computers. To fix this, you must change the code as follows.

public void CacheData()

{

   SPListItemCollection oListItems;

 

   lock(this)

   {

      oListItems = (SPListItemCollection)Cache["ListItemCacheName"];

      if(oListItems == null)

      {

         oListItems = DoQueryToReturnItems();

         Cache.Add("ListItemCacheName", oListItems, ..);

     }

   }

}

Note: 

It is possible to increase performance slightly by placing the lock inside the if(oListItems == null) code block. When you do this, you do not need to suspend all threads while checking to see if the data is already cached. Depending on how long it takes the query to return the data, there is still the possibility that more than one user might be running the query at the same time. This is especially true if you are running on multiprocessor computers. Remember that the more processors running and the longer the query takes to run, the more likely putting the lock in the if() code block will cause problems. While the previous example might cause a slight performance hit, it is the only way to ensure that you will not have multiple queries running at the same time.

 This code suspends all other threads in a critical section running in Internet Information Services, and prevents other threads from accessing the cached object until it is completely built.
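The trade-off described in the note is the classic check-lock-check pattern. A sketch of that variant, reusing the same hypothetical cache key and DoQueryToReturnItems helper as the examples above (the elided Cache.Add arguments are kept as in the original):

```csharp
public void CacheData()
{
   SPListItemCollection oListItems;

   // First check without locking: if the item is already cached,
   // no thread ever has to wait.
   oListItems = (SPListItemCollection)Cache["ListItemCacheName"];
   if(oListItems == null)
   {
      lock(this)
      {
         // Check again inside the lock: another thread may have filled
         // the cache while this one was waiting to acquire the lock.
         oListItems = (SPListItemCollection)Cache["ListItemCacheName"];
         if(oListItems == null)
         {
            oListItems = DoQueryToReturnItems();
            Cache.Add("ListItemCacheName", oListItems, ..);
         }
      }
   }
}
```

Whether the shorter lock window is worth the occasional duplicate query depends on how expensive DoQueryToReturnItems is in your environment.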

The previous example addresses the thread synchronization issue; however, it is still not correct because it is caching an object that is not thread safe. To address thread safety, you could cache a DataTable object that is created from the SPListItemCollection object. For example, you would modify the previous example as follows.

public void CacheData()

{

   DataTable oDataTable;

   SPListItemCollection oListItems;

 

   lock(this)

   {

      oDataTable = (DataTable)Cache["ListItemCacheName"];

      if(oDataTable == null)

      {

         oListItems = DoQueryToReturnItems();

         oDataTable = oListItems.GetDataTable();

         Cache.Add("ListItemCacheName", oDataTable, ..);

      }

   }

}

 

 

Your code then gets the data from the DataTable object. For more information and examples of using the DataTable object, and other good ideas for developing SharePoint applications, see Tips and Tricks for Developing with Windows SharePoint Services.
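Once the DataTable is cached, run-time lookups never touch the SharePoint object model again. A sketch, assuming the cache key from the example above and a hypothetical Title column in the query results:

```csharp
public string GetFirstTitle()
{
   DataTable oDataTable = (DataTable)Cache["ListItemCacheName"];
   if(oDataTable == null)
   {
      return null; // Not cached yet; run CacheData() first.
   }

   // DataTable.Select filters the cached rows with a simple expression,
   // so no new SPWeb or SPListItemCollection objects are created.
   DataRow[] aRows = oDataTable.Select("Title IS NOT NULL");
   return (aRows.Length > 0) ? (string)aRows[0]["Title"] : null;
}
```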

Writing Code That Is Scalable

Some developers are not aware that they need to write their code to be scalable for handling multiple users at the same time. A good example of this is creating custom navigation information for all sites and subsites on each page or as part of a master page. For example, if you have a SharePoint site on a corporate intranet and each department has its own site with many subsites, your code might resemble the following.

public void GetNavigationInfoForAllSitesAndWebs()

{

   foreach(SPSite oSPSite in SPContext.Current.Site.WebApplication.Sites)

   {

      using(SPWeb oSPWeb = oSPSite.RootWeb)

      {

         AddAllWebs(oSPWeb);

      }

   }

}

 

public void AddAllWebs(SPWeb oSPWeb)

{

   foreach(SPWeb oSubWeb in oSPWeb.Webs)

   {

      //.. Code to add items ..

      AddAllWebs(oSubWeb);

      oSubWeb.Dispose();

   }

}

 

While the previous code disposes of objects properly, it still causes problems because the code is going through the same lists over and over. For example, if you have 10 site collections and an average of 20 sites or subsites per site collection, you would iterate through the same code 200 times. For a small number of users this might not cause bad performance. But, as you add more users to the system, the problem gets worse. Table 2 shows this.


Table 2. Iterations increase as the number of users increases

Users    Iterations

10       2,000

50       10,000

100      20,000

250      50,000

 

The code executes for each user that hits the system, but the data remains the same for everyone. The impact of this can vary depending on what the code is doing. In some cases, repeating code over and over might not cause a performance problem; however, in the previous example the system has to create a COM object (SPSite or SPWeb objects are created when retrieved from their collections), retrieve data from the object, and then dispose of it for each item in the collection. This creates a lot of performance overhead.

How can you make this code more scalable or fine-tuned for a multiple user environment? This can be a hard question to answer, and it depends on what the application is designed to do. There are a few things that you need to take into consideration when asking how to make code more scalable:

·         Is the data static (seldom changes), somewhat static (changes occasionally), or dynamic (constantly changing)?

·         Is the data the same for all users, or does it change? For example, does it change depending on the user who is logged on, the part of the site being accessed, or the time of year (seasonal information)?

·         Is the data easily accessible or does it require a long time to return the data? For example, is it returning from a long-running SQL query or from remote databases that can have some network latency in the data transfers?

·         Is the data public or does it require a higher level of security?

·         What is the size of the data?

·         Is the SharePoint site on a single server or on a server farm?

 

Depending on how you answer the previous questions, there are several different ways you can make your code more scalable and handle multiple users.

Caching Raw Data

You can cache your data by using the System.Web.Caching.Cache object. This object requires that you query the data one time and store it in the cache for access by other users.

If your data is static, you can set up the cache to load the data once and not expire until the application is restarted, or to load once a day to ensure data freshness. You can create the cache item when the application starts, when the first user session starts, or when the first user tries to access that data.

If your data is somewhat static, you can set up the cached items to expire within a certain number of seconds, minutes, or hours after it is created. This enables you to refresh your data within a timeframe that is acceptable to your users. Even if the data is cached for only 30 seconds, under heavy loads you will still see an increase of performance because you are running the code only once every 30 seconds instead of multiple times a second for every user hitting the system.
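For example, a 30-second absolute expiration can be set with the five-argument Cache.Insert overload. A sketch, reusing the hypothetical cache key and query helper from the earlier examples (the lock shown earlier is omitted for brevity; in production you would combine the two):

```csharp
public DataTable GetCachedData()
{
   DataTable oDataTable = (DataTable)Cache["ListItemCacheName"];
   if(oDataTable == null)
   {
      oDataTable = DoQueryToReturnItems().GetDataTable();

      // Absolute expiration: the entry is evicted 30 seconds after it is
      // added, so the query runs at most once every 30 seconds no matter
      // how many users hit the page.
      Cache.Insert("ListItemCacheName", oDataTable, null,
                   DateTime.UtcNow.AddSeconds(30),
                   System.Web.Caching.Cache.NoSlidingExpiration);
   }
   return oDataTable;
}
```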

Be sure to take into consideration the issues outlined previously in Caching Data and Objects.

 

 

Building Data Before Displaying It

Think about how your cached data will be used. If this data is used to make run-time decisions, putting it into a DataSet or DataTable object might be the best way to store it. You can then query those objects for the data to make run-time decisions. If the data is being used to display a list, table, or formatted page to the user, consider building a display object and storing that object in the cache. At run time, you need only to retrieve the object from the cache and call its render function to display its contents. You could also store the rendered output, but this can lead to security issues and the cached item could be quite large, causing a lot of page swapping or memory fragmentation.

Caching for a Single Server or Server Farm

Depending on how your SharePoint site is set up, you might have to address some caching issues differently. If your data must be the same on all servers at all times, then you must ensure that the same data is cached on each server. One way to ensure this is to create the cached data and store it on a common server or in an SQL database. Again, you must consider how long it takes to access the data and any security issues of the data being stored on a common server.

You could also create business-layer objects that cache data on a common sever, and then access that data by different interprocess communications available in networking objects or APIs.

Conclusion

To ensure that your SharePoint system performs at its best, you need to be able to answer the following questions about the code you write:

Ø  Does my code properly dispose of SharePoint objects?

Ø  Does my code cache objects properly?

Ø  Does my code cache the correct types of objects?

Ø  Does my code use thread synchronization when necessary?

Ø  Does my code work as efficiently for 1000 users as it does for 10 users?

If you consider these issues when you write your code, you will find that your SharePoint system runs more efficiently and that your users have a much better experience. You can also help to prevent unexpected failures and errors in your system.

Seven Pillars of SQL Server 2008

Last Sunday my data center technology seminar on SQL Server 2008 at the MS campus went very well, and I was really happy to see that lots of companies are interested and raising their hands to migrate to SQL Server 2008. So I thought I would share some of my own inputs on SQL Server 2008. It is really great work by MS in this release, reducing a lot of the workload for SQL Server DBAs and developers. In one word, SQL Server 2008 is awesome and Microsoft rocks. They have also launched new editions such as Workgroup, Web, Compact, and Express Advanced.

There are several reasons to choose SQL Server 2008, but I have found seven reasons why you may want it. Here they are…

1. Compression. The release offers row-level and page-level compression. The compression mostly takes place on the metadata. For instance, page compression stores common data for affected rows in a single place.

The metadata storage for variable-length fields is going to be completely crazy: they are pushing things into bits (instead of bytes). For instance, the length of a varchar will be stored in 3 bits.
2. Query plan freezing. On some occasions SQL Server decides to change its execution plan (in response to data changes, etc.), and a good plan can suddenly turn into a bad one. SQL Server 2008 lets you freeze a known-good plan with plan guides so performance stays predictable.

3. Delimited strings

At present in 2005, we need to pass delimited strings like below

exec spTestProcedure 'udai,production;gilles,finance;keith,sales'

Then the stored proc needs to parse the string inside the procedure, but now in 2008 Microsoft has introduced a new type called table-valued parameters (TVPs).

CREATE TYPE CustomDeptType AS TABLE (Name varchar(20), Dept varchar(20))
DECLARE @testCustom CustomDeptType
INSERT @testCustom SELECT 'udai', 'production'
INSERT @testCustom SELECT 'gilles', 'finance'
INSERT @testCustom SELECT 'keith', 'sales'

and below is how you can invoke the stored procedure:
exec spTestProcedure @testCustom
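From client code, the same TVP can be passed with ADO.NET by marking a parameter as SqlDbType.Structured. A sketch, reusing the hypothetical procedure and type names from above (the connection string is up to you):

```csharp
using System.Data;
using System.Data.SqlClient;

public void CallWithTvp(string connectionString)
{
   // Build the rows to send; the columns must match CustomDeptType.
   DataTable oTable = new DataTable();
   oTable.Columns.Add("Name", typeof(string));
   oTable.Columns.Add("Dept", typeof(string));
   oTable.Rows.Add("udai", "production");
   oTable.Rows.Add("gilles", "finance");

   using(SqlConnection oConn = new SqlConnection(connectionString))
   using(SqlCommand oCmd = new SqlCommand("spTestProcedure", oConn))
   {
      oCmd.CommandType = CommandType.StoredProcedure;

      // SqlDbType.Structured marks the parameter as table-valued, and
      // TypeName tells SQL Server which table type to map it to.
      SqlParameter oParam = oCmd.Parameters.AddWithValue("@testCustom", oTable);
      oParam.SqlDbType = SqlDbType.Structured;
      oParam.TypeName = "dbo.CustomDeptType";

      oConn.Open();
      oCmd.ExecuteNonQuery();
   }
}
```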

4. IntelliSense in SQL Server Management Studio (SSMS). This was previously possible in SQL Server 2000 and 2005 only with third-party add-ins like SQL Prompt ($195).
5. Auditing.
Excellent feature, thanks Bill. SQL Server 2008 introduces built-in auditing, so we no longer need to manually set up auditing for our databases; you may also be able to say no to Apex SQL Audit and your own home-grown tools.
6. Compound assignment operators (C# syntax).

You can use C#-style compound assignment like this: SET @count += 2. So they finally let a C# developer onto the SQL team.

7. Filtered indexes.
This is an awesome feature; I personally felt it is important. It allows you to create an index while specifying which rows are not to be included in the index. For example, index all rows where Status IS NOT NULL. To put it in simple words, this gets rid of all the dead weight in the index, which allows for faster queries.

Enjoy SQL Server 2008. You can download the trial version from here.