How to Avoid the Top Five SharePoint Performance Mistakes

SharePoint is growing quickly

SharePoint is without question a fast-growing platform, and Microsoft is making lots of money with it. It has been around for almost a decade and has grown from a small list and document management application into an application development platform on top of ASP.NET, using its own API to manage content in the SharePoint Content Database.

Over the years many things have changed – but some haven't. For example, SharePoint still uses a single database table to store ALL items in any SharePoint List. And this brings me straight to the #1 problem I have seen when working with companies that implemented their own solutions based on SharePoint.

#1: Iterating through SPList Items
As a developer I get access to an SPList object – either from my current SPContext or by creating an SPList object to access a list identified by its name. SPList provides an Items property that returns an SPListItemCollection object. The following code snippet shows one way to display the Title column of the first 100 items in the current SPList object:

SPList activeList = SPContext.Current.List;
for(int i=0;i<100 && i<activeList.Items.Count;i++) {
  SPListItem listItem = activeList.Items[i];
  htmlWriter.Write(listItem["Title"]);
}

Looks good – right? Although the above code works fine and performs great in a local environment, it is the number one performance problem I've seen in custom SharePoint implementations. The problem is the way the Items property is accessed. The Items property queries ALL items from the Content Database for the current SPList and "unfortunately" does so every time we access it. The retrieved items ARE NOT CACHED. In the loop example we access the Items property twice for every iteration – once to retrieve the Count, and once to access the actual item identified by its index. Analyzing the actual ADO.NET database activity of that loop shows us the following interesting result:

200 SQL Statements get executed when iterating through SPList.Items

Problem: The same SQL statement is executed over and over again, each time retrieving ALL items of the list from the content database. In my example above I had 200 SQL calls totaling more than one second of SQL execution time.

Solution: The solution for that problem is rather easy but unfortunately still rarely used. Simply store the SPListItemCollection object returned by the Items property in a variable and use it in your loop:

SPListItemCollection items = SPContext.Current.List.Items;
for(int i=0;i<100 && i<items.Count;i++) {
  SPListItem listItem = items[i];
  htmlWriter.Write(listItem["Title"]);
}

This queries the database only once and we work on an in-memory collection of all retrieved items.

Further reading: The wrong way to iterate through SharePoint SPList Items and Performance Considerations when using the SharePoint Object Model

#2: Requesting too much data from the content database
It is convenient to access data from the Content Database using the SPList object. But every time we do so, we end up requesting ALL items of the list. Look closer at the SQL statement shown in the example above: it starts with SELECT TOP 2147483648 and returns all defined columns of the current SPList.

Most developers I worked with were not aware that there is an easy option to only query the data that you really need using the SPQuery object. SPQuery allows you to:

a) limit the number of returned items
b) limit the number of returned columns
c) query specific items using CAML (Collaborative Application Markup Language)

Limit the number of returned items

If I only want to access the first 100 items in a list – or, e.g., page through items in steps of 100 elements (in case I implement data paging in my WebParts) – I can do that by using the SPQuery RowLimit and ListItemCollectionPosition properties. Check out Page through SharePoint Lists for a full example:

SPQuery query = new SPQuery();
query.RowLimit = 100; // we want to retrieve 100 items

// prevItems is the SPListItemCollection returned by the previous page's query
query.ListItemCollectionPosition = prevItems.ListItemCollectionPosition; // continue after the last item of the previous page
SPListItemCollection items = SPContext.Current.List.GetItems(query);
// now iterate through the items collection

The following screenshot shows us that SharePoint actually takes the RowLimit count and uses it in the SELECT TOP clause to limit the number of rows returned. It also uses the ListItemCollectionPosition in the WHERE clause to only retrieve elements with an ID > previous position.

SPQuery.RowLimit limits the number of records retrieved from the SharePoint Content Database

Limit the number of returned columns
If you only need certain columns from the List, SPQuery.ViewFields can be used to specify which columns to retrieve. By default, all columns are queried, which causes extra stress on the database to retrieve the data, requires more network bandwidth to transfer the data from SQL Server to SharePoint, and consumes more memory in your ASP.NET Worker Process. Here is an example of how to use the ViewFields property to only retrieve the ID, Text Field and XYZ columns:

SPQuery query = new SPQuery();
query.ViewFields = "<FieldRef Name='ID'/><FieldRef Name='Text Field'/><FieldRef Name='XYZ'/>";

Looking at the generated SQL makes the difference from the default query mode obvious:

SELECT clause only selects those columns defined in SPView or ViewFields

Query specific elements using CAML
CAML allows you to be very specific about which elements you want to retrieve. The syntax is a bit "bloated" (that is my personal opinion) as it uses XML to define a SQL-WHERE-like clause. Here is an example of such a query:

SPQuery query = new SPQuery();
query.Query = "<Where><Eq><FieldRef Name=\"ID\"/><Value Type=\"Number\">15</Value></Eq></Where>";

As I said, it is a bit "bloated" – but hey, it works :-)

Problem: The main problem I've seen is that developers usually go straight to SPList to retrieve list items, resulting in too much data being retrieved from the Content Database.

Solution: Use the SPQuery object and its features to limit the number of returned elements and columns.
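
For illustration, here is a minimal sketch that combines all three techniques – row limit, view fields, and a CAML filter. The column names and the filter value are just placeholders:

SPQuery query = new SPQuery();
query.RowLimit = 100; // only retrieve the first 100 items
query.ViewFields = "<FieldRef Name='ID'/><FieldRef Name='Title'/>"; // only retrieve the columns we need
query.Query = "<Where><Gt><FieldRef Name='ID'/><Value Type='Number'>15</Value></Gt></Where>"; // only items with ID > 15
SPListItemCollection items = SPContext.Current.List.GetItems(query);
foreach (SPListItem item in items) {
  htmlWriter.Write(item["Title"]);
}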

Further reading: Only request the data you really need and Page through SharePoint lists

#3: Memory Leaks with SPSite and SPWeb
In the very beginning I said "many things have changed – but some haven't". SharePoint still uses COM components for some of its core features – a relic of the ancient times. While there is nothing wrong with COM as such, the memory management of COM objects is a different story. SPSite and SPWeb objects are used by developers to gain access to the Content Database. What is not obvious is that these objects have to be explicitly disposed so that the underlying COM objects are released from memory once they are no longer needed.

Problem: SharePoint installations that do not dispose SPSite and SPWeb objects end up with an ASP.NET Worker Process that leaks memory (native and managed) and is eventually recycled by IIS when it runs out of memory. Recycling means losing all current user sessions and paying a performance penalty for the users whose requests hit the worker process right after recycling (the first requests are slow while the process starts up).

Solution: Monitor your memory usage to identify whether you have a memory leak or not. Use a memory profiler to identify which objects are leaking and what is creating them. In case of SPSite and SPWeb you should follow the Best Practices as described on MSDN. Microsoft also provides a tool to identify leaking SPSite and SPWeb objects called SPDisposeCheck.
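
To illustrate the Best Practices mentioned above, here is a minimal sketch of the recommended pattern – wrapping self-created SPSite and SPWeb objects in using statements so they are disposed deterministically. The site URL is a placeholder:

using (SPSite site = new SPSite("http://server/sites/mysite")) // placeholder URL
{
  using (SPWeb web = site.OpenWeb())
  {
    // work with the web object here ...
  } // web is disposed here
} // site is disposed here

Important: objects that SharePoint hands to you – such as SPContext.Current.Web or SPContext.Current.Site – are owned by SharePoint and must not be disposed by your code.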

The following screenshot shows the process of monitoring memory counters, using memory dumps and analyzing memory allocations using dynaTrace:

Identifying leaking SPSite and SPWeb Objects

Further reading: Identifying memory problems introduced by custom code

#4: Index Columns do not necessarily improve performance
When I did my research during my first SharePoint engagements I discovered several "interesting" implementation details. Having only a single database table to store all List Items makes it a bit tricky to propagate index column definitions down to SQL Server. Why is that? If we look at the AllUserData table in the SharePoint Content Database we see that this table contains columns for every column you could ever have in any SharePoint list – for instance 64 nvarchar columns, 16 int columns, 12 float columns, …

If you define an index on the first text column in your "My SharePoint List 1", another index on the second number column in your "My SharePoint List 2", and so on, you would end up having database indices defined on pretty much every column in your Content Database. Check out the further reading link to a previous blog of mine – it gives you a good overview of what the AllUserData table looks like.

Problem: Index Columns can speed up access to SharePoint Lists – but due to the way indices are implemented in SharePoint we have the following limitations:
a) for every defined index, SharePoint stores the index value of every list item in a separate table. A list with, let's say, 10,000 items means 10,000 rows in AllUserData and 10,000 additional rows in the NameValuePair table (used for indexing)
b) queries only make use of the first index column on a table. Additional index columns are not used to speed up database access

Solution: Really think about your index columns. They definitely help in cases where you do lookups on text columns. Keep in mind the additional overhead of an index and that multiple indices don’t give you additional performance gain.
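
For completeness, here is a small sketch of how an index can be enabled on a list column through the object model. The list and column names are hypothetical:

SPList list = SPContext.Current.Web.Lists["My SharePoint List 1"]; // hypothetical list name
SPField field = list.Fields["Customer"]; // hypothetical text column
field.Indexed = true; // SharePoint maintains the index values in a separate table
field.Update();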

Further reading: How list column indices really work under the hood and More on column index and their performance impact

#5: SharePoint is not a relational database for high-volume transactional processing
This problem should actually be #1 on my list, and here is why: in the last two years I've run into several companies that made one big mistake: they thought SharePoint was the most flexible database on earth because they could define Lists on the fly – modifying them as needed without worrying about the underlying database schema or about updating the database access logic after every schema change. These companies have something else in common: they all had to rewrite their SharePoint applications, replacing the Content Database with a "regular" relational database in most parts of their applications.

Problem: If you read through the previous four problem points it should be obvious why SharePoint is not a relational database and why it should not be used for high-volume transactional processing. Every data element is stored in a single table. Database indices are implemented using a second table that is then joined to the main table. Concurrent access to different lists is a problem because the data comes from the same table.

Solution: Before starting a SharePoint project really think about what data you have – how frequently you need it and how many different users modify it. If you have many data items that change frequently you should really consider using your own relational database. The great thing about SharePoint is that you are not bound to the Content Database. You can practically do whatever you want including access to any external database. Be smart and don’t end up rewriting your application after you’ve invested too much time already.

Further reading: Monitoring individual List usage and performance and How list column indices really work under the hood

Final words
These findings are a summary of the work I did on SharePoint in the last two years. I've spoken at several conferences and worked with different companies, helping them to speed up their SharePoint installations. Follow the links under Further Reading in the individual paragraphs or simply check out all SharePoint-related blog articles. Some of my SharePoint conference talks have been recorded (video + screen capture) and you can watch them on the Conference Material Download page.

As SharePoint is built on the .NET platform, you might also be interested in my latest White Papers about Continuous Application Performance for Enterprise .NET Systems.

Related reading:

  1. SharePoint: More on column index and their performance impact
  2. SharePoint: List Performance – How list column indices really work under the hood
  3. SharePoint: Only request data that you really need
  4. SharePoint: Lookup value Performance
  5. SharePoint: Page through SharePoint lists

More Stories By Andreas Grabner

Andreas Grabner has been helping companies improve their application performance for 15+ years. He is a regular contributor within the Web Performance and DevOps communities and a prolific speaker at user groups and conferences around the world. Reach him at @grabnerandi.
