VS2010 Load Testing for Distributed and Heterogeneous Applications

Microsoft added new interfaces for performance management solutions

Visual Studio 2010 is almost here – Microsoft just released the first Release Candidate, which looks quite solid. Microsoft added new interfaces for performance management solutions like dynaTrace to extend the Web- and Load-Testing capabilities (check out Ed Glas’s blog on what’s in VSTS Load Testing) – going beyond .NET environments and deeper than what the Load Testing Reports tell you about the performance of the tested application.

But before we go into what can be done by extending Visual Studio – let’s have a look at what we get out of the box:

Standard Load Testing Reports from Visual Studio 2010
While running a load test, Visual Studio 2010 collects all sorts of information: the response times of the executed requests, performance counters of the tested application infrastructure (CPU, memory, I/O, …) and the health of your load-testing infrastructure (load controller and agents). In my scenario I run a 4-tier (2 JVMs, 2 CLRs) web application. The 4 tiers communicate via SOAP Web Services (Axis->ASMX), and the frontend web application is implemented using Java Servlets. I run a 15-minute load test with increasing load. The test is structured into multiple transactions, e.g. Home Page, Search, Login, BuyDirect, … While running my test I also monitor all relevant performance counters from the application servers and the load-testing infrastructure.
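
To make the scenario more concrete, here is a minimal sketch of what one hop in that 4-tier chain looks like – the Java servlet tier forwarding a request to the next tier as a raw SOAP call over HTTP. The endpoint URL, namespace and operation name are invented for illustration; the real application uses Axis-generated stubs:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.IOException;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

// Sketch of one tier-to-tier hop: a SOAP request posted over HTTP.
// Endpoint, namespace and operation are hypothetical placeholders.
public class BackendSoapClient {

    private static final String ENDPOINT = "http://gospace-backend:8080/services/SearchService";

    public static String search(String destination) throws IOException {
        String envelope =
            "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\""
          + " xmlns:svc=\"http://example.com/gospace\">"
          + "<soapenv:Body><svc:search><svc:destination>" + destination
          + "</svc:destination></svc:search></soapenv:Body></soapenv:Envelope>";

        HttpURLConnection con = (HttpURLConnection) new URL(ENDPOINT).openConnection();
        con.setRequestMethod("POST");
        con.setDoOutput(true);
        con.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        con.setRequestProperty("SOAPAction", "\"search\"");
        try (OutputStream out = con.getOutputStream()) {
            out.write(envelope.getBytes(StandardCharsets.UTF_8));
        }
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(con.getInputStream(), StandardCharsets.UTF_8))) {
            StringBuilder response = new StringBuilder();
            String line;
            while ((line = in.readLine()) != null) response.append(line);
            return response.toString(); // raw SOAP response; a real client would parse it
        }
    }
}
```

Visual Studio 2010 allows me to monitor the current state of the load test via configurable graphs as shown here: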

Visual Studio Load Testing Graphs

The graphs show that response times of some (not all) of my transactions increase with increasing user load. They also highlight that CPU usage on my application server becomes a problem (it exceeds 80% with ~20 concurrent users). At the end of the load test a summary report highlights what load was executed against the application, which errors happened and which pages performed slowest:

Load Testing Summary Report

Switching to the Tables view gives a detailed breakdown into individual result dimensions, e.g. Transactions, Pages, Errors, …:

Visual Studio Load Testing Summary Tables

From the table view we can make the following observations:

  • 553 page requests exceeded my rule of 200ms per page
  • The 553 slow pages were menu.do, netpay.do and userlogin.do (you can see this when you look at the individual requests flagged as errors)
  • The LastMinute transaction was by far the slowest, with a 1.41s average response time and a max of 5.64s

What we don’t know is WHY THESE TRANSACTIONS ARE SLOW: the performance counters indicate that CPU is a potential problem, but they don’t tell us what caused the CPU overhead, whether it is something that can be fixed, or whether we are simply running against the performance boundaries of our system.

Performance Reports by dynaTrace captured during a Visual Studio 2010 Load Test
dynaTrace customers can download the Visual Studio 2010 plugin from the dynaTrace Community Portal. The package includes a Visual Studio Add-In and a Visual Studio Testing Plugin Library that extends Visual Studio’s Web- and Load-Testing capabilities. We also offer the Automatic Session Analysis plugin that helps in analyzing data captured during longer load tests.

I used dynaTrace Test Center Edition on my 4-tier application while running the load test. The Visual Studio 2010 plugin ensured that dynaTrace automatically captured all server-side transactions (PurePaths) in a dynaTrace Session, and that the transaction names used in the Web Test script were passed on to dynaTrace.
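
Conceptually, this kind of tagging amounts to marking each outgoing HTTP request with its transaction name so the server-side agent can group the captured traces accordingly. The plugin does all of this for you inside the web test; the snippet below is only a conceptual sketch in Java, and the header name and value format are assumptions for illustration:

```java
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

// Conceptual sketch only: tag a load-test request with its transaction name
// so a server-side APM agent can group the captured traces by that name.
// The header name and value format are assumptions for illustration; the
// dynaTrace VS2010 plugin performs the real tagging inside the web test.
public class TaggedRequest {

    public static int fire(String url, String transactionName) throws IOException {
        HttpURLConnection con = (HttpURLConnection) new URL(url).openConnection();
        // Hypothetical tagging header carrying the web-test transaction name
        con.setRequestProperty("x-dynaTrace", "NA=" + transactionName);
        return con.getResponseCode();
    }

    public static void main(String[] args) throws IOException {
        System.out.println(fire("http://localhost:8080/lastminute.do", "LastMinute"));
    }
}
```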

While the load test runs, the Load Testing Performance Dashboard that I created for my application lets me watch the incoming requests and the memory consumption of each of my JVMs and CLRs. I can also see which layers of my application contribute to overall performance – with layers being ADO.NET, ASP.NET, SharePoint, Servlets, JDBC, Web Services, RMI, .NET Remoting, … dynaTrace automatically detects these layers, which helps me understand which components/layers of my app actually consume most of the execution time and how increasing load affects each of them individually. Besides that, I also watch the number of SQL statements executed (whether via Java or .NET) and the number of exceptions that occur:

dynaTrace Load Testing Performance Dashboard

On the top left I see the individual transaction response times and the accumulated transaction counts underneath. These are the numbers of incoming requests, where it is easy to see how VS2010 increased the load during my test.
On the top right I see the memory usage of my two JVMs and underneath the memory usage of my two CLRs (it seems I have a nice memory leak in my 2nd JVM and one very “quiet” CLR).

The bottom left chart (titled Layer Breakdown) shows me what’s going on within my application with increasing load. I can see that my application scales well up to a certain user load – but then the Web Service layer (dark gray) starts performing much worse than all other involved application layers.
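
As an aside, the idea behind such a layer breakdown can be reduced to a few lines – this is purely illustrative and not dynaTrace’s implementation: each traced method is assigned to a layer based on its package, and execution time is accumulated per layer.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Purely illustrative: map each traced method to a layer by package prefix
// and accumulate execution time per layer. Not dynaTrace's implementation.
public class LayerBreakdown {

    private static final Map<String, String> LAYER_BY_PREFIX = new LinkedHashMap<>();
    static {
        LAYER_BY_PREFIX.put("javax.servlet", "Servlets");
        LAYER_BY_PREFIX.put("org.apache.axis", "Web Services");
        LAYER_BY_PREFIX.put("java.sql", "JDBC");
    }

    private final Map<String, Long> timePerLayer = new LinkedHashMap<>();

    // Record one timed method call; the class name decides the layer bucket
    public void record(String className, long micros) {
        String layer = "Custom Code"; // default bucket for application code
        for (Map.Entry<String, String> e : LAYER_BY_PREFIX.entrySet()) {
            if (className.startsWith(e.getKey())) {
                layer = e.getValue();
                break;
            }
        }
        timePerLayer.merge(layer, micros, Long::sum);
    }

    public static void main(String[] args) {
        LayerBreakdown b = new LayerBreakdown();
        b.record("org.apache.axis.client.Call", 1200);  // web service call
        b.record("java.sql.PreparedStatement", 300);    // database access
        b.record("com.gospace.SearchServlet", 150);     // custom code (name invented)
        b.timePerLayer.forEach((l, t) -> System.out.println(l + ": " + t + " µs"));
    }
}
```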

On the bottom right, the number of database statements and the number of exceptions show me that these counters increase linearly with increasing load – but it seems we have quite a lot of database queries (up to 350/second) and quite a lot of exceptions that we should investigate.

After the load test is finished, the first report I pull up shows the slowest web transactions grouped by the transaction names used in Visual Studio:

dynaTrace Performance Report per Web Transaction

I can see that LastMinute is indeed the slowest transaction, with a max of 5.6 seconds. The great thing about this report is that we get a detailed breakdown of these top transactions into application layers, database calls and method calls. We can immediately see that Java Web Services are the biggest performance contributor to the LastMinute transaction. We also see that we have several thousand database queries for the 448 requests to this transaction, and we see which Java & .NET methods contributed to the execution time. A click on Slowest Page opens the PurePath Dashlet showing every individual transaction that got executed. Sorting it by duration shows the big variance between the execution times. The PurePath Hot Spot View makes it easy to spot the most contributing methods in the slowest transaction:

Individual PurePaths of slowest running Transactions showing a big variance
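
Several thousand database queries for only 448 requests hint at a classic N+1 query pattern. Here is a hypothetical JDBC sketch of what that typically looks like in code – table, column and method names are made up for illustration:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

// Hypothetical illustration of how 448 requests can turn into thousands
// of database queries: one query per item (N+1) vs. one joined query.
public class NPlusOneExample {

    // Anti-pattern: one SELECT per offer id -> N+1 round trips per request
    static void loadOffersOneByOne(Connection con, int[] offerIds) throws SQLException {
        try (PreparedStatement ps =
                 con.prepareStatement("SELECT * FROM offers WHERE id = ?")) {
            for (int id : offerIds) {
                ps.setInt(1, id);
                try (ResultSet rs = ps.executeQuery()) { /* map one row */ }
            }
        }
    }

    // Better: fetch the whole result set with a single statement
    static void loadOffersInOneQuery(Connection con, int tripId) throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(
                "SELECT o.* FROM offers o JOIN trip_offers t ON o.id = t.offer_id "
              + "WHERE t.trip_id = ?")) {
            ps.setInt(1, tripId);
            try (ResultSet rs = ps.executeQuery()) { /* map all rows */ }
        }
    }
}
```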

With the PurePath Comparison feature I go one step further and find out where two executions of the same transaction with very different response times actually differ:

Comparing two transactions to identify the difference

Visually in the chart, as well as in the PurePath Comparison Tree, we see that getting the SpecialOffers and all calls in that context (creating the web service and calling it) make up most of the time difference. The difference table on the bottom lists all timing and structural differences between these two PurePaths, giving even more insight into where else we have differences.
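
One plausible reading of that hotspot – a guess based on the breakdown, not something the report proves by itself – is that the client-side web service stub is rebuilt on every request. In Axis 1.x terms, with invented endpoint and operation names, that pattern and a cheaper alternative look roughly like this:

```java
import javax.xml.namespace.QName;
import org.apache.axis.client.Call;
import org.apache.axis.client.Service;

// Sketch (Axis 1.x dynamic invocation) of why "creating the web service"
// can show up as a hotspot. Endpoint and operation names are illustrative.
public class SpecialOfferClient {

    private static final String ENDPOINT = "http://dotnetpay-frontend/SpecialOffers.asmx";
    private static final QName OPERATION = new QName("http://example.com/pay", "GetSpecialOffers");

    // Costly variant: a new Service and Call are built on every request
    public static Object getOffersSlow(String category) throws Exception {
        Service service = new Service();          // created per request
        Call call = (Call) service.createCall();  // created per request
        call.setTargetEndpointAddress(new java.net.URL(ENDPOINT));
        call.setOperationName(OPERATION);
        return call.invoke(new Object[] { category });
    }

    // Cheaper variant: create the Service once and reuse it
    private static final Service SHARED_SERVICE = new Service();

    public static Object getOffersFast(String category) throws Exception {
        Call call = (Call) SHARED_SERVICE.createCall(); // Call is not thread-safe, so per request
        call.setTargetEndpointAddress(new java.net.URL(ENDPOINT));
        call.setOperationName(OPERATION);
        return call.invoke(new Object[] { category });
    }
}
```

The comparison view above is exactly the kind of evidence that tells you whether such repeated setup cost is what separates the fast executions from the slow ones.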

Show me the PurePath for individual failed Web Requests
In the VS2010 Run Configuration for your load test you can specify that detailed response results be stored in a SQL database. This allows you to look up individual failed transactions, including the actual HTTP traffic and all associated timings, after the load test is finished. In my case I had another slow transaction type called BuyDirect. Via the VS2010 Load Testing Report I open individual failed transactions and analyze the individual requests that were slow:

Problematic Request from the Load Test with linkage to the dynaTrace PurePath

The result view shows me that the request took 1.988s. The dynaTrace VS2010 plugin adds a new tab in the Results Viewer with a PurePath link for that particular slow request; clicking this link opens the captured PurePath in the dynaTrace Client:

Long running Heterogeneous Transaction opened from Visual Studio

We can easily spot where the time is spent in this transaction – it is the web service call from the 2nd JVM (GoSpaceBackend) to the CLR that hosts the Web Service (DotNetPayFrontend). Part of the problem also seems to be related to the exceptions that happen when calling the web service. These exceptions never made it up to our own logging framework as they were handled internally by Axis, but they are caused by a configuration issue (we can look at the full exception stack traces here to find that out). With one further click I look at the Sequence Diagram of this transaction, which provides a better overview of the interactions between my 4 different servers:

dynaTrace Sequence Diagram showing interactions between servers for a single transaction

The sequence diagram continues beyond what’s in the screenshot – but I guess you get the idea that we have a very chatty transaction here.
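
To spell out what “chatty” means in code – with a hypothetical interface, not the application’s actual API – compare a per-item remote call with a batched one:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical remote interface used only to illustrate chattiness
interface PaymentService {
    double quoteFee(String bookingId);          // one network round trip per booking
    List<Double> quoteFees(List<String> ids);   // one round trip for the whole list
}

public class ChattyVsBatched {

    // Chatty: n remote calls, so latency adds up linearly with list size
    static List<Double> chatty(PaymentService svc, List<String> bookings) {
        List<Double> fees = new ArrayList<>();
        for (String id : bookings) fees.add(svc.quoteFee(id));
        return fees;
    }

    // Batched: one remote call regardless of list size
    static List<Double> batched(PaymentService svc, List<String> bookings) {
        return svc.quoteFees(bookings);
    }
}
```

Every remote call pays the full network and serialization round trip, so n calls cost roughly n times the latency of one; batching is usually the first remedy for a chatty sequence diagram.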

The dynaTrace VS2010 plugin allows me to drill down to the problematic methods in a distributed, heterogeneous transaction within a matter of seconds, saving me a lot of time compared to analyzing the problem based on the load testing report alone.

Share results with Developers and look up problems in Source Code
Now we have all this great information and have already found several hotspots that our developers should look into. Instead of giving my developers access to my test environment, I simply export the captured data to a dynaTrace Session file and attach it to a JIRA issue (or whatever bug tracking tool you use) that I assign to my developer. I can either export all captured data (PurePaths and performance counters) or be more specific and only export those PurePaths that have been identified as problematic.

Development picks up the dynaTrace Session file, imports it into their local dynaTrace Client and analyzes the same granular data that we analyzed in our test environment. Having the dynaTrace Visual Studio 2010 plugin installed allows the developer to look up individual methods in Visual Studio, starting from the PurePath or Methods Dashlet in the dynaTrace Client:

Lookup source code of problematic method

The dynaTrace plugin in Visual Studio – which requires your solution file to be open – searches for the selected method, opens the source code file and sets the cursor to that method:

Problematic source code method in Visual Studio 2010 Editor

The data is easily shareable with anybody who needs to look at it. Within a matter of seconds the developer ends up at the source code line in Visual Studio 2010 that represents a problematic method in terms of performance. The developer also has all the contextual information on hand that shows why individual executions of the same transaction were faster than others, as the PurePaths include information like method arguments, HTTP parameters, SQL statements with bind variables, exception stack traces, … – all information that developers will love you for :-)

Identify Regressions across Test Runs
When running continuous load tests against different builds, we expect performance to get better and better. But what if that is not the case? What has changed from the last build to the current one? Which components don’t perform as well as they did in the previous build? Has the way we access the database changed? Is it an algorithm in custom code that takes too much time, or is it a new 3rd-party library introduced with this build that slows everything down?

The Automatic Session Analysis plugin can also analyze data across two load testing sessions, generating a report that highlights the differences between them. The following screenshot shows the result of a load testing regression analysis:

Regression Analysis by comparing two load testing sessions

It shows us which transactions were actually executed in the latest (top left) and previous (top right) build. In the middle we get an overview of which layers/components contributed to performance in each of the two sessions, along with a side-by-side comparison (center) where the bars tell us which components performed faster or slower. It seems we had a serious performance decrease in most of our components. On the bottom we additionally see a comparison of executed database statements and methods. Similar to what I showed in the previous sections, we would drill into this report to analyze further details.

In Summary
Visual Studio 2010 is a good tool for performing load tests against .NET or Java web applications. The Load Testing Reports have been improved in this version and give you a better understanding of the performance of your application. For multi-tier or heterogeneous applications like the one I used in my scenario, it is now easy to go beyond the standard load testing reports by using an Application Performance Management solution like dynaTrace. The combination of a load testing solution and an APM solution not only tells you that you have a performance problem – it allows you to identify the problem faster and therefore reduce test cycles and the time spent in the testing phase.

There is more to read if you are interested in these topics: a White Paper on how to Automate Load Testing and Problem Analysis, webinars with Novell and Zappos that use a combination of a load testing solution and dynaTrace to speed up their testing process, as well as an additional blog post called 101 on Load-Testing.

Feedback is always welcome and appreciated – thanks for reading all the way to the end :-)

Related reading:

  1. Getting ready for TechReady8: Load- and Web-Testing with VSTS and dynaTrace
  2. Visual Studio Team System for Unit-, Web- and Load-Testing with dynaTrace
  3. Performance Analysis: Identify GC bottlenecks in distributed heterogeneous environments
  4. Boston .NET User Group: Load and Performance Testing: How to do Transactional Root-Cause Analysis with Visual Studio Team System for Testers
  5. How to extend Visual Studio 2010 Web- and Load-Testing with Transactional Tracing

More Stories By Andreas Grabner

Andreas Grabner has been helping companies improve their application performance for 15+ years. He is a regular contributor within Web Performance and DevOps communities and a prolific speaker at user groups and conferences around the world. Reach him at @grabnerandi
