
How to Avoid the Top Five SharePoint Performance Mistakes

SharePoint is growing quickly

SharePoint is without question a fast-growing platform and Microsoft is making lots of money with it. It has been around for almost a decade and has grown from a small list and document management application into an application development platform on top of ASP.NET, using its own API to manage content in the SharePoint Content Database.

Over the years many things have changed – but some haven't. For example, SharePoint still uses a single database table to store ALL items of any SharePoint List. And this brings me straight to the #1 problem I have seen when working with companies that implemented their own solutions on top of SharePoint.

#1: Iterating through SPList Items
As a developer I get access to an SPList object – either by taking it from the current SPContext or by creating an SPList object for a list identified by its name. SPList provides an Items property that returns an SPListItemCollection object. The following code snippet shows one way to display the Title column of the first 100 items in the current SPList object:

SPList activeList = SPContext.Current.List;
for(int i=0;i<100 && i<activeList.Items.Count;i++) {
  SPListItem listItem = activeList.Items[i];
  htmlWriter.Write(listItem["Title"]);
}

Looks good – right? Although the above code works fine and performs great in a local environment, it is the number 1 performance problem I've seen in custom SharePoint implementations. The problem is the way the Items property is accessed. The Items property queries ALL items of the current SPList from the Content Database – and "unfortunately" it does so EVERY time the property is accessed. The retrieved items ARE NOT CACHED. In the loop example we access the Items property twice per iteration – once to retrieve the Count, and once to access the actual item identified by its index. Analyzing the actual ADO.NET database activity of that loop shows the following interesting result:

200 SQL Statements get executed when iterating through SPList.Items

Problem: The same SQL statement is executed over and over again, each time retrieving ALL items of this list from the content database. In my example above I ended up with 200 SQL calls totalling more than 1s of SQL execution time.

Solution: The solution to that problem is rather easy but unfortunately still rarely used. Simply store the SPListItemCollection object returned by the Items property in a variable and use it in your loop:

SPListItemCollection items = SPContext.Current.List.Items;
for(int i=0;i<100 && i<items.Count;i++) {
  SPListItem listItem = items[i];
  htmlWriter.Write(listItem["Title"]);
}

This queries the database only once, and we then work on an in-memory collection of all retrieved items.
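By the way: if all you need is the number of items in a list, you don't have to touch the Items property at all. As a small aside (not part of the original example), SPList exposes an ItemCount property that is served from the list metadata without loading any items:

int count = SPContext.Current.List.ItemCount; // no item data is fetched from the content database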

Further readings: The wrong way to iterate through SharePoint SPList Items and Performance Considerations when using the SharePoint Object Model

#2: Requesting too much data from the content database
It is convenient to access data from the Content Database through the SPList object. But every time we do so, we end up requesting ALL items of the list. Look closer at the SQL statement shown in the example above: it starts with SELECT TOP 2147483648 and returns all defined columns of the current SPList.

Most developers I worked with were not aware that there is an easy option to only query the data that you really need using the SPQuery object. SPQuery allows you to:

a) limit the number of returned items
b) limit the number of returned columns
c) query specific items using CAML (Collaborative Application Markup Language)

Limit the number of returned items

If I only want to access the first 100 items in a list – or, for example, page through items in steps of 100 elements (in case I implement data paging in my WebParts) – I can do that by using the SPQuery RowLimit and ListItemCollectionPosition properties. Check out Page through SharePoint Lists for a full example:

SPQuery query = new SPQuery();
query.RowLimit = 100; // we want to retrieve 100 items

// prevItems is the SPListItemCollection of the previously retrieved page -
// its ListItemCollectionPosition tells SharePoint where to continue
query.ListItemCollectionPosition = prevItems.ListItemCollectionPosition;
SPListItemCollection items = SPContext.Current.List.GetItems(query);
// now iterate through the items collection
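To round off the paging pattern, here is a minimal sketch of a loop that walks through an entire list 100 items at a time (htmlWriter is borrowed from the earlier snippets; treat this as an illustrative sketch rather than production code):

SPList list = SPContext.Current.List;
SPQuery query = new SPQuery();
query.RowLimit = 100; // page size

do
{
  SPListItemCollection page = list.GetItems(query);
  foreach (SPListItem item in page)
  {
    htmlWriter.Write(item["Title"]);
  }
  // position of the last item returned - null when there are no more pages
  query.ListItemCollectionPosition = page.ListItemCollectionPosition;
} while (query.ListItemCollectionPosition != null);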

The following screenshot shows that SharePoint actually takes the RowLimit count and uses it in the SELECT TOP clause to limit the number of rows returned. It also uses the ListItemCollectionPosition in the WHERE clause to retrieve only elements with an ID greater than the previous position.

SPQuery.RowLimit limits the number of records retrieved from the SharePoint Content Database

Limit the number of returned columns
If you only need certain columns from the list, SPQuery.ViewFields can be used to specify which columns to retrieve. By default, all columns are queried, which causes extra stress on the database to retrieve the data, requires more network bandwidth to transfer the data from SQL Server to SharePoint, and consumes more memory in your ASP.NET Worker Process. Here is an example of how to use the ViewFields property to retrieve only the ID, Text Field and XYZ columns:

SPQuery query = new SPQuery();
query.ViewFields = "<FieldRef Name='ID'/><FieldRef Name='Text Field'/><FieldRef Name='XYZ'/>";
SPListItemCollection items = SPContext.Current.List.GetItems(query); // only the three requested columns are fetched

Looking at the generated SQL makes the difference from the default query mode obvious:

SELECT clause only selects those columns defined in SPView or ViewFields

Query specific elements using CAML
CAML allows you to be very specific about which elements you want to retrieve. The syntax is a bit "bloated" (that is my personal opinion) as it uses XML to define a SQL-WHERE-like clause. Here is an example of such a query:

SPQuery query = new SPQuery();
query.Query = "<Where><Eq><FieldRef Name=\"ID\"/><Value Type=\"Number\">15</Value></Eq></Where>";
SPListItemCollection items = SPContext.Current.List.GetItems(query); // returns only the item with ID 15

As I said, it is a bit "bloated" – but hey, it works :-)

Problem: The main problem I've seen is that developers usually go straight to SPList to retrieve list items, which results in far more data being retrieved from the Content Database than is actually needed.

Solution: Use the SPQuery object and its features to limit the number of elements and columns.
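Putting a), b) and c) together, here is a hedged sketch of a query that only touches the data it really needs (the field names and the filter value are made up for illustration):

SPQuery query = new SPQuery();
query.RowLimit = 100;                                                // a) limit the number of items
query.ViewFields = "<FieldRef Name='ID'/><FieldRef Name='Title'/>";  // b) limit the columns
query.Query = "<Where><Gt><FieldRef Name='ID'/><Value Type='Number'>100</Value></Gt></Where>"; // c) CAML filter
SPListItemCollection items = SPContext.Current.List.GetItems(query);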

Further readings: Only request the data you really need and Page through SharePoint lists

#3: Memory Leaks with SPSite and SPWeb
In the very beginning I said "many things have changed – but some haven't". SharePoint still uses COM components for some of its core features – a relic of ancient times. While there is nothing wrong with COM itself, memory management of COM objects is a different story. SPSite and SPWeb objects are used by developers to gain access to the Content Database. What is not obvious is that these objects have to be explicitly disposed for the underlying COM objects to be released from memory once they are no longer needed.

Problem: SharePoint installations that fail to dispose SPSite and SPWeb objects end up with an ASP.NET Worker Process that leaks memory (native and managed) and will eventually be recycled by IIS when memory runs out. Recycling means losing all current user sessions, and users who hit the worker process right after recycling pay a performance penalty because the first requests during startup are slow.

Solution: Monitor your memory usage to identify whether you have a memory leak. Use a memory profiler to identify which objects are leaking and what is creating them. For SPSite and SPWeb you should follow the Best Practices described on MSDN. Microsoft also provides a tool called SPDisposeCheck that identifies leaking SPSite and SPWeb objects.
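As a minimal sketch of the pattern recommended in the MSDN Best Practices (the site URL and list name below are placeholders): wrap SPSite and SPWeb objects that your own code creates in using blocks, and never dispose objects handed to you by SharePoint, such as SPContext.Current.Web.

// dispose SPSite/SPWeb objects created by your own code - the using
// statement guarantees Dispose() is called even if an exception is thrown
using (SPSite site = new SPSite("http://server/sites/demo")) // hypothetical URL
{
  using (SPWeb web = site.OpenWeb())
  {
    SPList list = web.Lists["Tasks"]; // hypothetical list name
    // ... work with the list ...
  }
}
// do NOT dispose objects owned by SharePoint, e.g. SPContext.Current.Web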

The following screenshot shows the process of monitoring memory counters, taking memory dumps and analyzing memory allocations using dynaTrace:

Identifying leaking SPSite and SPWeb Objects

Further reading: Identifying memory problems introduced by custom code

#4: Index Columns do not necessarily improve performance
When I did my SharePoint research during my first SharePoint engagements, I discovered several "interesting" implementation details. Having only a single database table to store all List Items makes it a bit tricky to propagate index column definitions down to SQL Server. Why is that? If we look at the AllUserData table in your SharePoint Content Database, we see that this table contains columns for every column you could ever define in any SharePoint list – for instance 64 nvarchar columns, 16 int columns, 12 float columns, …

If you defined an index on the first text column of your "My SharePoint List 1", another index on the 2nd number column of your "My SharePoint List 2", and so on, you would end up with database indices on pretty much every column in your Content Database. Check out the further reading link to a previous blog post of mine – it gives you a good overview of what the AllUserData table looks like.

Problem: Index columns can speed up access to SharePoint Lists – but due to the way indices are implemented in SharePoint we have the following limitations:
a) for every defined index, SharePoint stores the index value of every list item in a separate table. A list with, say, 10000 items means 10000 rows in AllUserData plus 10000 additional rows in the NameValuePair table (used for indexing)
b) queries only make use of the first index column on a table; additional index columns are not used to speed up database access

Solution: Really think about your index columns. They definitely help in cases where you do lookups on text columns. But keep in mind the additional overhead of an index, and remember that multiple indices don't give you any additional performance gain.
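For reference, here is a hedged sketch of how a single column can be marked as indexed through the object model ("Customer" is a made-up field name); given limitation b) above, one well-chosen index column is usually all that pays off:

SPList list = SPContext.Current.List;
SPField field = list.Fields["Customer"]; // hypothetical field
field.Indexed = true; // SharePoint now maintains index rows for this column
field.Update();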

Further readings: How list column indices really work under the hood and More on column index and their performance impact

#5: SharePoint is not a relational database for high-volume transactional processing
This problem should actually be #1 on my list, and here is why: in the last 2 years I've run into several companies that all made one big mistake: they thought SharePoint was the most flexible database on earth because they could define Lists on the fly, modifying them as needed without worrying about the underlying database schema or about updating the database access logic after every schema change. Besides this belief, these companies had something else in common: they all had to rewrite their SharePoint applications, replacing the Content Database with a "regular" relational database in most parts of their applications.

Problem: If you read through the previous four problem points, it should be obvious why SharePoint is not a relational database and why it should not be used for high-volume transactional processing: every data element is stored in a single table, database indices are implemented via a second table that is then joined to the main table, and concurrent access to different lists is a problem because all the data comes from the same table.

Solution: Before starting a SharePoint project, really think about what data you have, how frequently you need it, and how many different users modify it. If you have many data items that change frequently, you should seriously consider using your own relational database. The great thing about SharePoint is that you are not bound to the Content Database – you can practically do whatever you want, including accessing any external database. Be smart, and don't end up rewriting your application after you've already invested too much time.
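To illustrate the "own relational database" route, here is a minimal ADO.NET sketch (namespace System.Data.SqlClient; the connection string, table and column names are placeholders) for reading transactional data from a dedicated SQL Server database instead of the Content Database:

// connection string and table name are hypothetical
using (SqlConnection connection = new SqlConnection("Data Source=dbserver;Initial Catalog=MyAppDb;Integrated Security=SSPI"))
using (SqlCommand command = new SqlCommand("SELECT TOP 100 Id, Title FROM Orders", connection))
{
  connection.Open();
  using (SqlDataReader reader = command.ExecuteReader())
  {
    while (reader.Read())
    {
      htmlWriter.Write(reader["Title"]); // htmlWriter borrowed from the earlier snippets
    }
  }
}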

Further reading: Monitoring individual List usage and performance and How list column indices really work under the hood

Final words
These findings are a summary of the work I have done on SharePoint over the last two years. I've spoken at several conferences and worked with different companies, helping them speed up their SharePoint installations. Follow the links under Further Readings in the individual paragraphs, or simply check out all SharePoint-related blog articles. Some of my SharePoint conference talks have been recorded (video + screen capture) and you can watch them on the Conference Material Download page.

As SharePoint is built on the .NET platform, you might also be interested in my latest White Papers about Continuous Application Performance for Enterprise .NET Systems.

Related reading:

  1. SharePoint: More on column index and their performance impact
  2. SharePoint: List Performance – How list column indices really work under the hood
  3. SharePoint: Only request data that you really need
  4. SharePoint: Lookup value Performance
  5. SharePoint: Page through SharePoint lists

More Stories By Andreas Grabner

Andreas Grabner has been helping companies improve their application performance for 15+ years. He is a regular contributor within Web Performance and DevOps communities and a prolific speaker at user groups and conferences around the world. Reach him at @grabnerandi
