Marissa's Guide to the .NET Garbage Collector

"What's wrong, Uncle John?" I hadn't realized how clearly my facial expressions were betraying my inner feelings. I had been working on a new coding project, and as I worked I became more and more amazed by the automatic memory management provided by .NET. It seemed that almost by magic the runtime was able to figure out which objects were no longer needed and which should hang around, and - yet more amazing - it could even call special cleanup routines.

"Nothing is wrong, Marissa; I just wish I had time to dive into the automatic memory management features of .NET," I replied hastily. Honestly, what could a five-month-old baby possibly know about the intricacies of garbage collectors and memory management?

"Is that all?" she replied. "The .NET garbage collector is rather easy to understand; it is based on a generational collection algorithm. If you like, I can explain it to you - in fact I can do it in a way even you could understand," she stated rather confidently.

"What? Are you kidding? You're going to school me?" I wasn't about to walk away from this challenge. "Go for it, kid. Let's see what great mass of knowledge resides in the mind of a tiny, diaper-wearing, .NET internals architect."

Starting Off On the Right Foot
"Well, first you need to understand some of the assumptions made by the garbage collector's designers. I'll use the GC abbreviation going forward to refer to the garbage collector, like we did in the nursery before I came home." Marissa then went on to explain that there are three basic assumptions:

  • Recently created objects typically have a short lifetime: Consider the example of a database connection. You create the ADO.NET connection object, interact with the database, and once you have your data the connection object typically isn't needed any longer. This is an example of a pattern in which a new object doesn't need to hang around and take up precious resources.
  • Objects that are older typically have a longer lifetime: A good example of this is Windows Forms elements. A tree control that has been populated with information may be around for the lifetime of the application. In this pattern, holding on to the memory is less costly than repeatedly re-creating and repopulating the tree control.
  • Smaller collections yield better performance: The GC is based on a managed heap, and it is faster to walk small sections of the heap than it is to walk the entire heap. So any operation that involves only a portion of the heap is going to be faster than an operation that involves the entire heap.

"Well, Marissa, those are some good assumptions, but they really don't tell me much about how it all works," I said. She just frowned like babies do when they are frustrated and want you to just pay attention.

"The next thing you need to understand is the basics of how objects are created and how memory is allocated. Then I can walk you through some of the deeper GC concepts."

"Okay, Marissa, it's your show; however you want to do this is fine." It's always a good idea to placate a baby if you can.

She reviewed the object creation process with me and she was right: understanding how objects are created and how memory is allocated started to give me some insight into the internals of the GC. Rather than outline them here, I've included her overview in Table 1.

The important thing to keep in mind is that when you create a new object, the compiler emits a newobj IL instruction. This instruction allocates memory and initializes it; the initialization sets the object's initial state and is performed by the constructor. Since the GC knows nothing about the state of your object, it has no way to clean it up intelligently. Marissa explained that the GC has some tricks to deal with intelligent cleanup, but it is really up to the programmer to implement them. As babies often do, though, she wanted to focus on the basics first.
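A minimal C# sketch of the point above: the `new` keyword compiles to a newobj IL instruction, which allocates heap memory and then runs the constructor to set the object's initial state. `NetworkNode` and its members are hypothetical names used only for illustration.

```csharp
using System;

// Hypothetical example type: `new NetworkNode(...)` compiles to a newobj
// IL instruction, which allocates memory on the managed heap and then
// runs this constructor to establish the object's initial state.
class NetworkNode
{
    public string Name { get; }
    public bool IsOnline { get; }

    public NetworkNode(string name)
    {
        // Constructor = the initialization half of newobj.
        Name = name;
        IsOnline = true;
    }
}

class Program
{
    static void Main()
    {
        NetworkNode node = new NetworkNode("atlantic-01"); // newobj under the hood
        Console.WriteLine($"{node.Name} online={node.IsOnline}");
        // The GC tracks the allocation itself, but it knows nothing about
        // what Name or IsOnline mean - cleanup logic is the programmer's job.
    }
}
```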

"When your applications create objects, Uncle John, do you find that they create a series of objects at once?" she asked.

"Well, if you mean that objects created within the same scope usually have some type of strong relationship to each other, the answer would be yes."

That was what she was asking. I learned that this natural pattern helps improve performance in the GC. As she explained earlier, "smaller collections yield better performance," so the fact that interrelated objects are near each other from a memory perspective means the GC can typically clean them all up at the same time. How this cleanup occurs, I learned, is based on something called generations.

Coming to Terms with Generations
"As you know, Uncle John, the GC is based on a generational approach." Marissa made her way to the whiteboard and started to draw some figures. "I've created a representation of an empty memory heap here (see Figure 1). When you create new objects, they are added to what is internally known as 'Generation 0'. When the CLR is initialized it sets a size for Generation 0 - 250KB, for instance. So let's say some objects are created; since they are new, they are all added to Generation 0, with the exception of objects larger than 85KB, which we will talk about later. The memory heap would now look like this (see Figure 2)."

I scratched my head. "Uh, go on."

"Okay, suppose we create more objects. What do you think happens? Well, if the objects created cause a situation in which we exceed our 250KB size for Generation 0..."

Interrupting, I asked, "So all the objects in Generation 0 can take up a total of 250KB, or whatever the CLR initialized it to, right?"

"Yes, that's right," she replied, and went on without pausing. "So if we exceed the boundary, the GC will move any objects that are still reachable from a root to Generation 1, which, like Generation 0, has a memory size limit set by the CLR - about four times that of Generation 0. Once the objects are moved, the GC will tear down any remaining objects in Generation 0, which is left empty. The memory is then compacted and the new objects are allocated."
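The promotion Marissa describes can be observed directly with `GC.GetGeneration`. This is a minimal sketch: `GC.Collect()` is used here only to make the behavior visible (production code should normally let the GC self-tune), and the exact generation numbers are typical runtime behavior rather than a documented guarantee.

```csharp
using System;

class Program
{
    static void Main()
    {
        object survivor = new object();
        // Small, freshly created objects start in Generation 0.
        Console.WriteLine($"at creation: gen {GC.GetGeneration(survivor)}");

        // Force a collection. Because `survivor` is still reachable,
        // the GC promotes it rather than tearing it down.
        GC.Collect();
        Console.WriteLine($"after one collection: gen {GC.GetGeneration(survivor)}");

        GC.Collect();
        Console.WriteLine($"after two collections: gen {GC.GetGeneration(survivor)}");

        GC.KeepAlive(survivor); // keep the root live through both collections
    }
}
```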

"Okay, okay. I think I'm starting to understand." I was getting excited - she was making sense. "Let me see if I can guess what happens next. When another set of objects is created, the GC will see if Generation 0 has exceeded its size limit; if it has, it will move the objects in Generation 0 to Generation 1, but also check to see if Generation 1 has exceeded its size limit. If moving the objects from Generation 0 to Generation 1 causes Generation 1 to exceed its limit, it will move reachable objects to Generation 2. Upon initialization the CLR also sets a size limit for Generation 2, probably about twice that of Generation 1. Am I right?"

Marissa nodded and then explained that there is no Generation 3; the .NET garbage collector has only three generations, and they are numbered starting at zero. She also explained that the GC is self-tuning: it can dynamically increase or decrease the size of each generation, which yields better performance. For instance, if the GC determines that you are using a high number of small, short-lived objects, it might reduce the Generation 0 working set and free up resources for other areas. She also expanded on the issue with large objects, those greater than 85KB. Large objects are created directly in Generation 2 to avoid the performance cost of starting in Generation 0, immediately going over the size limit, and repeating the process for Generation 1. Creating large objects in Generation 2 helps overall system performance. This is something I didn't know - what a smart little kid.
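You can see the large-object behavior for yourself: on current runtimes the threshold is 85,000 bytes, and objects allocated on the large object heap are reported by `GC.GetGeneration` as Generation 2. A small sketch, assuming typical runtime behavior:

```csharp
using System;

class Program
{
    static void Main()
    {
        byte[] small = new byte[1_000];   // ordinary allocation: starts in Generation 0
        byte[] large = new byte[100_000]; // over the ~85,000-byte threshold: large object heap

        Console.WriteLine($"small array: gen {GC.GetGeneration(small)}");
        // Large-object-heap allocations are reported as Generation 2,
        // so they skip the collect-and-promote churn of Generations 0 and 1.
        Console.WriteLine($"large array: gen {GC.GetGeneration(large)}");
    }
}
```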

Being Reachable
"So Marissa, when you said the GC checks if the object is reachable, what did you mean?" She explained that the GC builds a graph of all reachable objects. A "reachable" object is one that still has a root - a storage location, such as a static field or a local variable, that contains a pointer to the object. The assumption is that if something is still pointing to the object, it is still being used. When the GC finds that an object is still in use it will not clean it up. On the other hand, objects that are not reachable can be destroyed by the GC to free up memory. When the GC is moving objects between generations, it will only move objects that are still reachable. A move is more expensive than releasing memory, so the fewer objects the GC has to move, the better. After a collection the GC ensures that Generation 0 is empty and all surviving objects live in either Generation 1 or 2.
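Reachability can be observed with a `WeakReference`, which lets you watch an object without rooting it. A minimal sketch, assuming typical release-build behavior (debug builds may extend local lifetimes and keep the "unrooted" object alive longer):

```csharp
using System;
using System.Runtime.CompilerServices;

class Program
{
    // Created in a separate, non-inlined method so no stray root to the
    // new object lingers on Main's stack frame.
    [MethodImpl(MethodImplOptions.NoInlining)]
    static WeakReference CreateUnrooted() => new WeakReference(new object());

    static void Main()
    {
        object rooted = new object();              // reachable: a live local roots it
        WeakReference unrooted = CreateUnrooted(); // no strong root remains

        GC.Collect();
        GC.WaitForPendingFinalizers();

        Console.WriteLine($"rooted survives:        {rooted != null}");
        Console.WriteLine($"unrooted still alive:   {unrooted.IsAlive}"); // typically False
        GC.KeepAlive(rooted);
    }
}
```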

Finalizing Objects
Marissa highlighted some implications of the GC that are important to software developers. Since the GC has no true knowledge of your object, it is important to understand the use of the Finalize and Dispose methods and how they affect the GC and your code.

First, let's realize that there are no deterministic destructors in .NET. It is important to understand this because otherwise you can make false assumptions about how your system behaves. The closest equivalents are Finalize and Dispose. The Finalize method is called by the GC and the Dispose method is designed to be called programmatically.
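The usual way to pair the two is shown below: `Dispose` gives deterministic, programmer-driven cleanup (often via a `using` block), while the finalizer (C#'s `~TypeName` syntax, which compiles to `Finalize`) acts as a GC-driven safety net. A minimal sketch; `TempFile` is a hypothetical resource holder.

```csharp
using System;

// Sketch of the common pattern: Dispose for deterministic cleanup,
// the finalizer as a fallback if Dispose was never called.
class TempFile : IDisposable
{
    public bool Disposed { get; private set; }

    public void Dispose()
    {
        Disposed = true;            // release the expensive resource here
        GC.SuppressFinalize(this);  // cleanup already done: spare the GC
                                    // the cost of calling Finalize later
    }

    ~TempFile()                     // runs only if Dispose was never called,
    {                               // at some unpredictable time chosen by the GC
        Disposed = true;
    }
}

class Program
{
    static void Main()
    {
        using (var f = new TempFile())
        {
            // work with the resource; Dispose runs deterministically
            // when this block exits - no waiting on the GC
        }
        Console.WriteLine("disposed deterministically");
    }
}
```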

When a new object is created, objects that have a Finalize method get a special mention in what is known as the finalization list. This list contains pointers to all objects that have a Finalize method. By inspecting this list the GC can determine which objects need Finalize called on them and which can simply be deleted. The process of calling Finalize is expensive; therefore, you should use Finalize with care. During the first pass, the GC looks at the heap and determines which objects are not reachable; it then reviews the finalization list, and if it finds that one of the nonreachable objects has a Finalize method, it copies the pointer from the finalization list to what is known as the "freachable queue". The nonreachable object is then moved to the next generation (as stated earlier, Generation 0 always gets emptied). The Finalize method is then executed from the freachable queue, and during the next pass the GC will find that the object is still not reachable and can finally reclaim its memory.
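That two-pass behavior can be observed with a resurrection-tracking `WeakReference`, which stays valid until the object's memory is actually reclaimed rather than merely queued for finalization. A sketch under the assumption of typical release-build runtime behavior; `GC.Collect`/`GC.WaitForPendingFinalizers` are used only to make the passes visible.

```csharp
using System;
using System.Runtime.CompilerServices;

class Finalizable
{
    public static bool FinalizerRan;
    ~Finalizable() { FinalizerRan = true; }
}

class Program
{
    [MethodImpl(MethodImplOptions.NoInlining)]
    static WeakReference Create() =>
        // trackResurrection: true keeps the weak reference valid until the
        // object's memory is reclaimed, not just until it is queued.
        new WeakReference(new Finalizable(), trackResurrection: true);

    static void Main()
    {
        WeakReference wr = Create();

        GC.Collect();                  // pass 1: object found unreachable; its
                                       // pointer moves to the freachable queue
        Console.WriteLine($"pending finalization, still in memory: {wr.IsAlive}");

        GC.WaitForPendingFinalizers(); // the finalizer thread runs Finalize
        GC.Collect();                  // pass 2: now the memory can go
        Console.WriteLine($"finalizer ran: {Finalizable.FinalizerRan}");
        Console.WriteLine($"reclaimed after second pass: {!wr.IsAlive}");
    }
}
```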

The key concept to take away from this is that you do not know when the GC will call the Finalize method. If you are holding unmanaged or expensive resources, they could be around for much longer than you expect. Also, the creation of objects with Finalize methods takes a little longer, since they are not only placed on the heap but also require a pointer to be established in the finalization list. You should also consider the situation in which an object that has a Finalize method holds references to other objects. The GC will not clean up these other objects until after the object with the Finalize method is cleaned up, so it is important to consider downstream object designs when you are implementing systems that use objects with Finalize. In fact, if you are seeing performance or resource issues and cannot understand why something is still referenced, be sure that a parent object doesn't have a Finalize method that is keeping the resource alive. Here is a little pop quiz to see if you are getting Marissa's message.

Years ago I worked on the Argo project, which involved a system that could represent undersea long-distance phone networks. A network wrapping around Africa could involve thousands and thousands of network nodes. Each type of network node was represented as an object and stored in an array. Consider what would happen under .NET and the GC if we populated a structure of some sort with, say, 750 objects in a new version of Argo and each object had a Finalize method. What do you think would happen when the GC kicked in? Imagine that all the objects were very small and that even with 750 objects we didn't exceed the Generation 0 size limit. In this situation the GC would need to carry out the finalization semantics for all 750 objects. If you guessed that it would allocate 750 pointers in the finalization list, copy 750 pointers to the freachable queue, move 750 objects to Generation 1, and then call 750 Finalize methods, you guessed right. Now consider what that would mean to performance...
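The quiz scenario can be sketched in a few lines. `Node` stands in for the hypothetical Argo network node; every instance is registered on the finalization list at creation, so dropping the array triggers the full promote-then-finalize-then-reclaim cycle for all 750 objects (the forced collections are only there to make it observable):

```csharp
using System;

class Node
{
    public static int Finalized;
    ~Node() { Finalized++; }   // every Node drags in the finalization machinery
}

class Program
{
    static void Main()
    {
        var nodes = new Node[750];
        for (int i = 0; i < nodes.Length; i++)
            nodes[i] = new Node();     // 750 entries added to the finalization list

        nodes = null;                  // drop the only root to all 750 objects

        GC.Collect();                  // 750 pointers copied to the freachable
                                       // queue; 750 objects promoted, not freed
        GC.WaitForPendingFinalizers(); // 750 Finalize calls on the finalizer thread
        GC.Collect();                  // only now is the memory reclaimed

        Console.WriteLine($"finalized: {Node.Finalized}");
    }
}
```

Had `Node` carried no finalizer, the same 750 objects would simply have been swept away in a single Generation 0 collection.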

Nap Time
Marissa is starting to interject loud baby wails into her talk on the GC, so I think it is time for her to have a nap. She did promise to explain GC threading implications, strong and weak object references, the use of the Dispose method, object resurrection, programming the GC, and more on the Finalize method in Part 2 of this article.

More Stories By John Gomez

John Gomez, open source editor for .NET Developer's Journal, has over 25 years of software development and architectural experience, and is considered a leader in the design of highly distributed transaction systems. His interests include chaos- and fuzzy-based systems, self-healing and self-reliant systems, and offensive security technologies, as well as artificial intelligence. John started developing software at age 9 and is currently the CTO of Eclipsys Corporation, a worldwide leader in hospital and physician information systems.
