How to Avoid the Top Five SharePoint Performance Mistakes

SharePoint is growing quickly

SharePoint is without question a fast-growing platform, and Microsoft is making lots of money with it. It has been around for almost a decade and has grown from a small list and document management application into an application development platform on top of ASP.NET, with its own API for managing content in the SharePoint Content Database.

Over the years many things have changed – but some haven't. For example, SharePoint still uses a single database table to store ALL items in any SharePoint List. And this brings me straight to the #1 problem I have seen when working with companies that implemented their own solutions based on SharePoint.

#1: Iterating through SPList Items
As a developer I get access to an SPList object – either from my current SPContext or by creating an SPList object to access a list identified by its name. SPList provides an Items property that returns an SPListItemCollection object. The following code snippet shows one way to display the Title column of the first 100 items in the current SPList object:

SPList activeList = SPContext.Current.List;
for(int i=0;i<100 && i<activeList.Items.Count;i++) {
  SPListItem listItem = activeList.Items[i];
  htmlWriter.Write(listItem["Title"]);
}

Looks good – right? Although the above code works fine and performs well in a local environment, it is the number one performance problem I've seen in custom SharePoint implementations. The problem is the way the Items property is accessed. The Items property queries ALL items from the Content Database for the current SPList and "unfortunately" does so every time we access the Items property. The retrieved items ARE NOT CACHED. In the loop example we access the Items property twice for every loop iteration – once to retrieve the Count, and once to access the actual item identified by its index. Analyzing the actual ADO.NET database activity of that loop shows us the following interesting result:

200 SQL statements get executed when iterating through SPList.Items

Problem: The same SQL statement is executed over and over again, each time retrieving ALL items of this list from the content database. In my example above this amounted to 200 SQL calls totaling more than 1s of SQL execution time.

Solution: The fix for that problem is rather easy but unfortunately still rarely used. Simply store the SPListItemCollection object returned by the Items property in a variable and use it in your loop:

SPListItemCollection items = SPContext.Current.List.Items;
for(int i=0;i<100 && i<items.Count;i++) {
  SPListItem listItem = items[i];
  htmlWriter.Write(listItem["Title"]);
}

This queries the database only once and we work on an in-memory collection of all retrieved items.
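
If you don't need index-based access, a foreach loop over the cached collection works just as well. Here is a short sketch (my addition, not part of the original snippet; note that it iterates over all items rather than just the first 100):

SPListItemCollection items = SPContext.Current.List.Items; // database is queried only once
foreach (SPListItem listItem in items)
{
  htmlWriter.Write(listItem["Title"]); // no further database round-trips
}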

Further readings: The wrong way to iterate through SharePoint SPList Items and Performance Considerations when using the SharePoint Object Model

#2: Requesting too much data from the content database
It is convenient to access data from the Content Database using the SPList object. But every time we do so we end up requesting ALL items of the list. Look closely at the SQL statement shown in the example above: it starts with SELECT TOP 2147483648 and returns all defined columns of the current SPList.

Most developers I've worked with were not aware that there is an easy way to query only the data you really need: the SPQuery object. SPQuery allows you to:

a) limit the number of returned items
b) limit the number of returned columns
c) query specific items using CAML (Collaborative Application Markup Language)

Limit the number of returned items

If I only want to access the first 100 items in a list – or, for example, page through items in chunks of 100 elements (in case I implement data paging in my WebParts) – I can do that using the SPQuery RowLimit and ListItemCollectionPosition properties. Check out Page through SharePoint Lists for a full example:

SPQuery query = new SPQuery();
query.RowLimit = 100; // we want to retrieve 100 items

// prevItems is the SPListItemCollection returned by the previous query;
// its ListItemCollectionPosition tells SharePoint where to continue
query.ListItemCollectionPosition = prevItems.ListItemCollectionPosition;
SPListItemCollection items = SPContext.Current.List.GetItems(query);
// now iterate through the items collection

The following screenshot shows us that SharePoint actually takes the RowLimit count and uses it in the SELECT TOP clause to limit the number of rows returned. It also uses the ListItemCollectionPosition in the WHERE clause to only retrieve elements with an ID > previous position.

SPQuery.RowLimit limits the number of records retrieved from the SharePoint Content Database
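
For completeness, here is a minimal sketch of a full paging loop built on this pattern (my own illustration; the per-item processing is omitted):

SPQuery query = new SPQuery();
query.RowLimit = 100; // page size

do
{
  SPListItemCollection items = SPContext.Current.List.GetItems(query);
  foreach (SPListItem item in items)
  {
    // process the current item ...
  }
  // becomes null once the last page has been read
  query.ListItemCollectionPosition = items.ListItemCollectionPosition;
} while (query.ListItemCollectionPosition != null);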

Limit the number of returned columns
If you only need certain columns from the list, SPQuery.ViewFields can be used to specify which columns to retrieve. By default all columns are queried, which puts extra stress on the database to retrieve the data, requires more network bandwidth to transfer the data from SQL Server to SharePoint, and consumes more memory in your ASP.NET Worker Process. Here is an example of how to use the ViewFields property to retrieve only the ID, Text Field, and XYZ columns:

SPQuery query = new SPQuery();
query.ViewFields = "<FieldRef Name='ID'/><FieldRef Name='Text Field'/><FieldRef Name='XYZ'/>";
SPListItemCollection items = SPContext.Current.List.GetItems(query); // only the three listed columns are retrieved

Looking at the generated SQL makes the difference from the default query mode obvious:

The SELECT clause only selects those columns defined in the SPView or ViewFields

Query specific elements using CAML
CAML allows you to be very specific about which elements you want to retrieve. The syntax is a bit "bloated" (that is my personal opinion) as it uses XML to define something like a SQL WHERE clause. Here is an example of such a query:

SPQuery query = new SPQuery();
query.Query = "<Where><Eq><FieldRef Name='ID'/><Value Type='Number'>15</Value></Eq></Where>";

As I said, it is a bit "bloated" – but hey, it works :-)

Problem: The main problem I've seen is that developers go straight to SPList to retrieve list items, resulting in much more data being retrieved from the Content Database than is actually needed.

Solution: Use the SPQuery object and its features to limit the number of elements and columns, as the combined sketch below shows.
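
Here is a minimal sketch (my own, not taken from a specific product sample) that combines all three options in a single query; the field names are placeholders:

SPQuery query = new SPQuery();
query.RowLimit = 100; // a) limit the number of returned items
query.ViewFields = "<FieldRef Name='ID'/><FieldRef Name='Title'/>"; // b) limit the returned columns
query.Query = "<Where><Gt><FieldRef Name='ID'/><Value Type='Number'>15</Value></Gt></Where>"; // c) CAML filter

SPListItemCollection items = SPContext.Current.List.GetItems(query);
foreach (SPListItem item in items)
{
  // work only with the requested fields, e.g. item["Title"]
}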

Further readings: Only request the data you really need and Page through SharePoint lists

#3: Memory Leaks with SPSite and SPWeb
In the very beginning I said "many things have changed – but some haven't". SharePoint still uses COM components for some of its core features – a relic of "the ancient times". While there is nothing inherently wrong with COM, the memory management of COM objects is a different story. SPSite and SPWeb objects are used by developers to gain access to the Content Database. What is not obvious is that these objects have to be explicitly disposed for the underlying COM objects to be released from memory once they are no longer needed.

Problem: SharePoint installations that do not dispose their SPSite and SPWeb objects end up with an ASP.NET Worker Process that leaks memory (native and managed) and that will eventually be recycled by IIS once it runs out of memory. Recycling means losing all current user sessions and paying a performance penalty for those users that hit the worker process again after recycling has finished (the first requests are slow during startup).

Solution: Monitor your memory usage to identify whether or not you have a memory leak. Use a memory profiler to identify which objects are leaking and what is creating them. In the case of SPSite and SPWeb you should follow the Best Practices as described on MSDN. Microsoft also provides a tool called SPDisposeCheck to identify leaking SPSite and SPWeb objects.
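
As a quick illustration, here is a minimal sketch of the dispose pattern those Best Practices describe (URL and list name are placeholders):

using (SPSite site = new SPSite("http://server/sites/demo")) // placeholder URL
using (SPWeb web = site.OpenWeb())
{
  SPList list = web.Lists["My List"]; // placeholder list name
  // work with the list ...
} // site and web – and their underlying COM resources – are released here

One caveat: only dispose objects you created yourself. SPContext.Current.Site and SPContext.Current.Web are owned by SharePoint and must not be disposed.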

The following screenshot shows the process of monitoring memory counters, using memory dumps, and analyzing memory allocations with dynaTrace:

Identifying leaking SPSite and SPWeb objects

Further reading: Identifying memory problems introduced by custom code

#4: Index Columns do not necessarily improve performance
When I did my research during my first SharePoint engagements I discovered several "interesting" implementation details. Having only a single database table to store all list items makes it a bit tricky to propagate index column definitions down to SQL Server. Why is that? If we look at the AllUserData table in the SharePoint Content Database we see that this table contains columns for all possible columns that you can ever have in any SharePoint list – for instance 64 nvarchar columns, 16 int columns, 12 float columns, …

If you defined an index on the first text column of your "My SharePoint List 1", another index column on the 2nd number column of your "My SharePoint List 2", and so on, you would end up with database indices defined on pretty much every column in your Content Database. Check out the further reading link to a previous blog post of mine – it gives you a good overview of what the AllUserData table looks like.

Problem: Index Columns can speed up access to SharePoint Lists, but due to the way indices are implemented in SharePoint we have the following limitations:
a) for every defined index SharePoint stores the index value for every list item in a separate table. Having a list with, let's say, 10000 items means that we have 10000 rows in AllUserData and 10000 additional rows in the NameValuePair table (used for indexing)
b) queries only make use of the first index column on a table. Additional index columns are not used to speed up database access

Solution: Really think about your index columns. They definitely help in cases where you do lookups on text columns. Keep in mind the additional overhead of an index and that multiple indices don’t give you additional performance gain.
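
For reference, here is a minimal sketch of defining an index column through the object model (list and field names are placeholders; remember that every value of an indexed column also ends up as a row in the NameValuePair table):

SPList list = SPContext.Current.Web.Lists["My SharePoint List 1"]; // placeholder name
SPField field = list.Fields["Title"]; // the column to index
field.Indexed = true; // each list item now also gets a row in the index table
field.Update();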

Further readings: How list column indices really work under the hood and More on column index and their performance impact

#5: SharePoint is not a relational database for high-volume transactional processing
This problem should actually be #1 on my list, and here is why: in the last two years I've run into several companies that made one big mistake: they thought SharePoint was the most flexible database on earth because they could define Lists on the fly – modifying them as needed without worrying about the underlying database schema or having to update the database access logic after every schema change. Besides this belief, these companies have something else in common: they all had to rewrite their SharePoint applications, replacing the Content Database with a "regular" relational database in most parts of their applications.

Problem: If you read through the previous four problem points it should be obvious why SharePoint is not a relational database and why it should not be used for high-volume transactional processing. Every data element is stored in a single table. Database indices are implemented using a second table that is then joined to the main table. Concurrent access to different lists is a problem because the data comes from the same table.

Solution: Before starting a SharePoint project, really think about what data you have, how frequently you need it, and how many different users modify it. If you have many data items that change frequently, you should really consider using your own relational database. The great thing about SharePoint is that you are not bound to the Content Database – you can do practically whatever you want, including accessing any external database. Be smart and don't end up rewriting your application after you've already invested too much time.

Further reading: Monitoring individual List usage and performance and How list column indices really work under the hood

Final words
These findings are a summary of the work I did on SharePoint over the last two years. I've spoken at several conferences and worked with different companies, helping them speed up their SharePoint installations. Follow the links under Further Readings in the individual paragraphs, or simply check out all SharePoint-related blog articles. Some of my SharePoint conference talks have been recorded (video + screen capture) and you can watch them on the Conference Material Download page.

As SharePoint is built on the .NET platform, you might also be interested in my latest White Papers about Continuous Application Performance for Enterprise .NET Systems.

Related reading:

  1. SharePoint: More on column index and their performance impact
  2. SharePoint: List Performance – How list column indices really work under the hood
  3. SharePoint: Only request data that you really need
  4. SharePoint: Lookup value Performance
  5. SharePoint: Page through SharePoint lists

More Stories By Andreas Grabner

Andreas Grabner has more than a decade of experience as an architect and developer in the Java and .NET space. In his current role, Andi works as a Technology Strategist for Compuware and leads the Compuware APM Center of Excellence team. In his role he influences the Compuware APM product strategy and works closely with customers in implementing performance management solutions across the entire application lifecycle. He is a frequent speaker at technology conferences on performance and architecture-related topics, and regularly authors articles offering business and technology advice for Compuware’s About:Performance blog.
