The Importance of Accurately Modeling User Interactions in Performance Testing

Take a closer look at the factors that go into creating a realistic load that will yield more accurate results

Load testing, perhaps more than any other form of testing, is one of those activities that you either choose to do well or risk a result that leaves you worse off than not doing it at all. Half-hearted attempts at load testing yield "results," but too often those results are inaccurate, leading to a false sense of security for anyone who trusts them. This, in turn, leads to the release of applications that are not adequately tested and that experience performance problems soon after entering production.

I was reminded of this not long ago, when I worked with a customer who related an experience that may sound familiar to many of you. This customer was a test engineer for a bank that had recently merged with another bank, effectively doubling its customer base. He was part of a team responsible for load testing a new web application that would serve customers from both of the original banks. Before the application was rolled out, they performed load tests and confirmed that the application could handle the expected number of users with acceptable response times. When the system went live, however, it was slow as molasses - even under user loads lower than those the team had tested.

The problem, as you may have guessed, was that the team had not accurately modeled the load. The virtual users used in the testing were a homogenous group that interacted with the system in roughly the same way, from roughly the same geographic locations, at the same network speed. In reality, the customers who came from Bank A tended to perform certain transactions much more frequently than those who came from Bank B. Most of Bank B's customers lived in a different part of the country than those from Bank A. More important, customers from both banks were accessing the application at widely differing connection speeds across a range of browsers. None of these factors was modeled accurately in the load tests the team had performed. In some cases this was because the team simply had not considered them; in others, the load testing tool they were using provided no way to handle these differences. In either case, the result was the same: the team had given the "go live" signal to an application that was not ready, basing their decision on inaccurate load test results.

Too often, organizations take a shortcut to load testing. They focus on a single number: how many concurrent users their application will support. As a result, they put little effort into script development, and they end up with an unrealistic test - one of little value. I encourage all load testers to think beyond the concurrent users metric and take a closer look at other factors that go into creating a realistic load that will yield more accurate results, including:

  • Modeling user activity
  • Modeling different connection speeds
  • Modeling different browsers and mobile devices
  • Modeling geographically distributed users

Parameterizing Scripts to Better Model User Activity
Scripts that simply record a typical user's interaction with a web application and then play it back are not going to yield accurate performance data. As an example, a script that emulates a user logging into a site, searching for a product, placing it in the cart, and checking out does little to test the performance of other user activities such as checking product reviews, accessing detailed specifications, or comparing products.

More important, if the script always logs in as the same user and orders the same product, caching effects will often skew the performance measurements, making response times shorter than they would be under a real-world load. Caching on the web server, application server, and database server all come into play, compounding any caching that is done on the client side.

To minimize caching and similar effects, scripts must be parameterized. In my example above, the script would play back different users searching for different products and purchasing them via different payment methods. Ideally the script would use randomization or data customization to fill in every user-editable or selectable element on each form of the web application. This script parameterization, combined with creating multiple scripts to address a variety of user interactions, produces a much more realistic user load, and it's worth choosing a load testing tool that simplifies these tasks.
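
To make this concrete, here is a minimal sketch of a parameterized script using Locust, an open-source Python load testing tool. The endpoints (/login, /search, /cart, /checkout), the test-accounts CSV file, and the product SKUs are hypothetical placeholders; the point is simply that each virtual user draws different credentials, products, and payment methods so that server-side caches are not repeatedly hit with identical requests.

    import csv
    import random
    from locust import HttpUser, task, between

    # Hypothetical test data; in practice this comes from your own data set.
    with open("test_accounts.csv") as f:
        ACCOUNTS = list(csv.DictReader(f))  # columns: username, password

    PRODUCTS = ["P1001", "P1002", "P2507", "P3310"]           # hypothetical SKUs
    PAYMENT_METHODS = ["credit_card", "paypal", "gift_card"]  # hypothetical options

    class ShopperUser(HttpUser):
        wait_time = between(1, 5)  # think time between actions

        def on_start(self):
            # Each virtual user logs in with different credentials, so
            # sessions and caches are not shared across the whole load.
            account = random.choice(ACCOUNTS)
            self.client.post("/login", json={"username": account["username"],
                                             "password": account["password"]})

        @task(3)
        def search_and_buy(self):
            product = random.choice(PRODUCTS)
            self.client.get("/search", params={"q": product})
            self.client.post("/cart", json={"sku": product, "qty": random.randint(1, 3)})
            self.client.post("/checkout", json={"payment": random.choice(PAYMENT_METHODS)})

        @task(1)
        def browse_reviews(self):
            # Exercise activities beyond the main purchase path.
            self.client.get(f"/products/{random.choice(PRODUCTS)}/reviews")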

Generating a Load with a Mixture of Connection Speeds and Network Characteristics
Many testing teams use the fastest available network connections when load testing a server. The belief is that if the application performs well over those connections, it is guaranteed to work well in production, where many real-world users will have slower connections. This is a faulty assumption that leads to performance problems when the application is subjected to real-world users accessing it at a variety of network bandwidths.

Testing with only high-speed connections can mask performance problems that surface only when lower-speed connections are used. Slower data speeds force connections to the server to stay open longer, and eventually the server may reach its limit on the number of simultaneously open connections.

Of course, testing with only low-speed connections is equally problematic. What's needed is a reasonable mixture of virtual users accessing the server at connection speeds representative of everything from 56K modems for dial-up users to T3 lines.

With more and more users accessing the web via mobile devices, it makes sense to include 3G and 4G connection rates in the mix as well. It's also important to take into account disparities in signal strength that can cause packet loss and increased network latency. Built-in support for incorporating these factors in performance testing is increasingly important, particularly for web applications that serve a high percentage of mobile users.
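
One way to think about this, independent of any particular tool, is to assign each virtual user a connection profile drawn from a weighted mix. The sketch below is illustrative only: the profile names, bandwidths, latencies, and weights are assumptions, and commercial load testing tools typically apply this kind of throttling at the network layer rather than by estimation.

    import random

    # Hypothetical mix of connection profiles; weights reflect the expected user base.
    # Bandwidths are in kilobits per second, latency in milliseconds.
    CONNECTION_PROFILES = [
        {"name": "dial-up 56K", "kbps": 56,     "latency_ms": 150, "weight": 2},
        {"name": "3G mobile",   "kbps": 2_000,  "latency_ms": 100, "weight": 20},
        {"name": "4G mobile",   "kbps": 12_000, "latency_ms": 50,  "weight": 30},
        {"name": "DSL/cable",   "kbps": 25_000, "latency_ms": 30,  "weight": 38},
        {"name": "T3 line",     "kbps": 45_000, "latency_ms": 10,  "weight": 10},
    ]

    def pick_profile():
        """Assign a virtual user one profile, weighted like the real population."""
        weights = [p["weight"] for p in CONNECTION_PROFILES]
        return random.choices(CONNECTION_PROFILES, weights=weights, k=1)[0]

    def estimated_transfer_seconds(payload_bytes, profile, packet_loss=0.0):
        """Rough time to pull a payload over this connection.

        Bandwidth delay plus latency; packet loss is modeled crudely as
        retransmitted bytes, which also keeps the connection open longer.
        """
        effective_bytes = payload_bytes * (1 + packet_loss)
        bandwidth_bytes_per_s = profile["kbps"] * 1000 / 8
        return effective_bytes / bandwidth_bytes_per_s + profile["latency_ms"] / 1000

    if __name__ == "__main__":
        profile = pick_profile()
        # A 500 KB page over this connection, with 2% packet loss:
        print(profile["name"],
              round(estimated_transfer_seconds(500_000, profile, 0.02), 2), "s")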

Emulating Different Browsers and Native Mobile Apps
Interestingly enough (and often surprising to some), not all browsers support the same number of concurrent HTTP connections. This needs to be accounted for as well - if a load test models the entire user population accessing a web application with a single browser that opens four connections per server, it neglects the effects of browsers that open twice that many.

This leads to a situation similar to the one that arises from inaccurate modeling of connection speeds - with more concurrent connections, it is not unusual to see slowdowns as a server reaches its limit on simultaneous connections. To minimize these effects, load tests should apply a variety of browser profiles during playback, so that the traffic appears to originate from a realistic mixture of different browsers, including mobile browsers.
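
As a rough illustration of what a browser profile can capture, the following Python sketch (using the requests library) gives each virtual user a User-Agent string and caps its connection pool at the number of concurrent connections the emulated browser would open per host. The User-Agent strings and connection limits shown are simplified assumptions; dedicated load testing tools generally ship with maintained browser profiles.

    import random
    import requests
    from requests.adapters import HTTPAdapter

    # Hypothetical browser profiles: User-Agent string plus the number of
    # concurrent connections that browser typically opens per host.
    BROWSER_PROFILES = [
        {"name": "Chrome desktop", "max_connections": 6,
         "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/120.0"},
        {"name": "Firefox desktop", "max_connections": 6,
         "user_agent": "Mozilla/5.0 (Windows NT 10.0; rv:121.0) Firefox/121.0"},
        {"name": "Safari mobile", "max_connections": 4,
         "user_agent": "Mozilla/5.0 (iPhone; CPU iPhone OS 17_0 like Mac OS X) Safari/604.1"},
    ]

    def make_virtual_user_session():
        """Build an HTTP session that looks and behaves like one browser profile."""
        profile = random.choice(BROWSER_PROFILES)
        session = requests.Session()
        session.headers.update({"User-Agent": profile["user_agent"]})
        # Cap the connection pool so this virtual user cannot open more
        # simultaneous connections to a host than the emulated browser would.
        adapter = HTTPAdapter(pool_connections=profile["max_connections"],
                              pool_maxsize=profile["max_connections"])
        session.mount("https://", adapter)
        session.mount("http://", adapter)
        return profile["name"], session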

Mobile devices, in fact, present a new set of challenges for load testers (see Best Practices for Load Testing Mobile Applications, Part 1 and Best Practices for Load Testing Mobile Applications, Part 2), aside from the network connection issues I've already covered. Many companies now have a separate mobile version of their site, with content tailored specifically for mobile users. Again, to perform a valid load test on such sites, a test engineer must be able to override the browser identification during playback so that the virtual user appears to be using a mobile browser.
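
A minimal sketch of such an override, again assuming a placeholder URL and a simplified mobile User-Agent string, might simply replay a request while identifying as a mobile browser and confirm that the server routes it to its mobile-specific content:

    import requests

    # Hypothetical mobile User-Agent; the URL below is a placeholder.
    MOBILE_UA = ("Mozilla/5.0 (iPhone; CPU iPhone OS 17_0 like Mac OS X) "
                 "AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.0 Mobile/15E148 Safari/604.1")

    def fetch_as_mobile(url):
        """Replay a request while identifying as a mobile browser, and report
        whether the server routed the request to mobile-specific content."""
        response = requests.get(url, headers={"User-Agent": MOBILE_UA},
                                allow_redirects=True, timeout=10)
        routed_to_mobile = ("m." in response.url) or ("mobile" in response.url)
        return response.status_code, response.url, routed_to_mobile

    if __name__ == "__main__":
        print(fetch_as_mobile("https://www.example.com/"))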

What about native mobile applications? There is no browser involved, so you'll need a testing solution that can record, parameterize, and play back the network traffic originating from the mobile device. In some cases this can be done via a proxy, but for other apps that is not an option. Those apps may call for a tunneling approach in which the testing tool acts as a DNS server. Even if you're not facing this situation today, you may want to confirm that your testing tool supports this capability so you're prepared when you do need it.
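
As one possible sketch of the proxy-based approach, the addon below uses mitmproxy (run with mitmdump -s record_app_traffic.py, with the device's HTTP proxy settings pointed at the machine running it) to capture each request the app makes so it can later be parameterized and replayed. This assumes the app's traffic can actually be routed through a proxy; apps that cannot be proxied are exactly the ones that call for the DNS-tunneling approach described above.

    # Hypothetical mitmproxy addon: run with `mitmdump -s record_app_traffic.py`
    # while the mobile device's HTTP proxy points at the machine running mitmdump.
    import json
    from mitmproxy import http

    class RecordAppTraffic:
        def __init__(self):
            self.recorded = []

        def request(self, flow: http.HTTPFlow) -> None:
            # Capture each request from the native app so it can later be
            # parameterized and replayed by the load testing tool.
            self.recorded.append({
                "method": flow.request.method,
                "url": flow.request.pretty_url,
                "headers": dict(flow.request.headers),
                "body": flow.request.get_text(),
            })

        def done(self):
            # Write out the recording when mitmdump shuts down.
            with open("recorded_traffic.json", "w") as f:
                json.dump(self.recorded, f, indent=2)

    addons = [RecordAppTraffic()]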

Generating a Geographically Distributed Load
Unless your end-user community is accessing your application from a single location, initiating tests solely from inside your datacenter is unlikely to represent a realistic load. Such tests fail to take into account the effects of third-party servers and content delivery networks that may sit between your users and your web application.

Using the cloud to generate load as part of your testing can better model a geographically distributed user base, one that may include users from around the world, enabling test engineers to run realistic, large-scale tests across multiple regions. Cloud testing complements internal, lab-based tests; ideally, test scripts developed for one environment can be reused in the other. With separate performance metrics for each geographic region in hand, engineers can see where performance issues are likely to arise on a region-by-region basis.
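
To illustrate the kind of region-by-region view this enables, here is a small Python sketch that aggregates response-time samples by region and reports an average and 95th percentile for each. The regions and timings are made-up sample data standing in for measurements collected from load generators running in different cloud regions.

    import statistics
    from collections import defaultdict

    # Hypothetical samples: (region, response_time_seconds) pairs collected
    # from load generators running in different cloud regions.
    samples = [
        ("us-east", 0.42), ("us-east", 0.51), ("us-east", 0.47),
        ("eu-west", 0.88), ("eu-west", 0.93), ("eu-west", 1.10),
        ("ap-south", 1.45), ("ap-south", 1.62), ("ap-south", 1.39),
    ]

    by_region = defaultdict(list)
    for region, seconds in samples:
        by_region[region].append(seconds)

    # Report average and 95th percentile response time per region, so a
    # region-specific slowdown (for example, a misbehaving CDN edge) stands out.
    for region, times in sorted(by_region.items()):
        p95 = statistics.quantiles(times, n=20)[-1]  # requires Python 3.8+
        print(f"{region:9s} avg={statistics.fmean(times):.2f}s p95={p95:.2f}s")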

If users are accessing your web site from all over the world, load testing from the cloud helps you model that reality. When this capability is combined with tests that incorporate parameterized scripts, browser differences, support for mobile apps, and a variety of connection speeds and network effects, you can trust the accuracy of your test results.

More Stories By Steve Weisfeldt

Steve Weisfeldt is a Senior Performance Engineer at Neotys, a provider of load testing software for Web applications. Previously, he worked as President of Engine 1 Consulting, a services firm specializing in all facets of test automation. Prior to his involvement at Engine 1 Consulting, he was a Senior Systems Engineer at Aternity. Before that, Steve spent seven years at automated testing vendor Segue Software (acquired by Borland). While spending most of his time at Segue delivering professional services and training, he was also involved in pre-sales and product marketing efforts.

Having worked in the load and performance testing space since 1999, Steve has been involved in load and performance testing projects of all sizes, in industries spanning the retail, financial services, insurance, and manufacturing sectors. His expertise lies in enabling organizations to optimize their ability to develop, test, and launch high-quality applications efficiently, on time, and on budget. Steve graduated from the University of Massachusetts-Lowell with a BS in Electrical Engineering and an MS in Computer Engineering.
