MSBuild - What It Does and What You Can Expect in the Future

The standard customizable build platform for the .NET Framework

What's New in MSBuild in .NET 3.5
.NET 3.5, which ships with Visual Studio 2008 (code-named "Orcas"), brings one big feature we have wanted from the start: support for multiple processors. Although this was a lot of work internally, it's easy for you to get the benefits. To switch on the feature, use the /m switch on msbuild.exe.
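
For example (the solution name here is hypothetical), a build limited to four concurrent processes looks like this:

    msbuild.exe MySolution.sln /m:4

Passing /m with no number tells MSBuild to use one build process per processor on the machine.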

MSBuild parallelizes at the project level. To do this safely, MSBuild has to know which projects depend on the outputs of others. It can figure this out by using the "ProjectReference" items that Visual Studio puts in Visual Basic and C# projects. Before you enable a multiprocessor build, make sure you have declared all your dependencies using this tag so that MSBuild can order the build correctly. "ProjectReference" is an item type understood by the Visual Basic and C# build process we ship; if you have a custom build process, you'll need to implement something similar using the MSBuild task.
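
As a sketch of both approaches (the project path and target name are hypothetical), a C# project declares a dependency with a ProjectReference item, and a custom build process can achieve the same safe ordering with the MSBuild task:

    <!-- In a C# project file: declare the dependency so MSBuild
         can order a multiprocessor build correctly. -->
    <ItemGroup>
      <ProjectReference Include="..\CoreLibrary\CoreLibrary.csproj" />
    </ItemGroup>

    <!-- In a custom build process: build dependencies through the
         MSBuild task, in parallel where it is safe to do so. -->
    <Target Name="BuildDependencies">
      <MSBuild Projects="@(ProjectReference)" BuildInParallel="true" />
    </Target>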

Multiprocessor support required changes to our logging infrastructure, so if you have a custom logger you may want to read about those changes. If you use the built-in loggers, you'll find we've improved the output format with specific attention to diagnosing multiprocessor builds.

Another important new feature in .NET 3.5 is support for multiple versions of the .NET Framework. MSBuild understands the concept of "Toolsets" defined in the configuration files and registry. These are sets of tasks, targets, and properties that can be switched in as a group. We added this to allow MSBuild in .NET 3.5 to switch between the .NET 3.5 tasks and targets and those that shipped with .NET 2.0. There's also a related concept, "TargetFrameworkVersion," that lets you build assemblies specifically targeting the 2.0, 3.0, or 3.5 version of the .NET Framework. Take a look at the MSBuild team blog for more information about these.
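
For instance (the project name is hypothetical), you can select a Toolset on the msbuild.exe command line with the /toolsversion (short form /tv) switch, and select the framework to target with a property in the project file:

    msbuild.exe MyProject.csproj /tv:2.0

    <PropertyGroup>
      <TargetFrameworkVersion>v3.0</TargetFrameworkVersion>
    </PropertyGroup>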

A new element was added to the MSBuild file format: ItemDefinitionGroup. Using ItemDefinitionGroup, you can specify default values for metadata on all items of an item type. For example, you can specify a default WarningLevel metadata value of "4" on items of the "Compile" type, and in your project files selectively override it with a different value.
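
A minimal sketch of the WarningLevel example (the file names are hypothetical):

    <ItemDefinitionGroup>
      <Compile>
        <WarningLevel>4</WarningLevel>  <!-- default for every Compile item -->
      </Compile>
    </ItemDefinitionGroup>

    <ItemGroup>
      <Compile Include="Program.cs" />  <!-- inherits WarningLevel 4 -->
      <Compile Include="Legacy.cs">
        <WarningLevel>1</WarningLevel>  <!-- selectively overridden -->
      </Compile>
    </ItemGroup>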

In MSBuild in .NET 2.0, it wasn't possible to modify or remove items once they were created. To get around this it was necessary to copy items selectively from one item list to another using the CreateItem task, which was hard to read and caused an explosion of item types. A similar problem exists for properties. In .NET 3.5 ItemGroup and PropertyGroup can be used inside Targets, so these special tasks aren't needed anymore. There's also new syntax to remove items selectively from item lists and modify their metadata. In some cases this can make the build process much simpler to write.
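
As an illustrative sketch (the target, property, and file names here are made up), the new syntax inside a Target looks like this:

    <Target Name="PrepareSources">
      <PropertyGroup>
        <!-- Properties can now be modified inside a target -->
        <BuildLabel>$(BuildLabel)-retail</BuildLabel>
      </PropertyGroup>
      <ItemGroup>
        <!-- Remove an item from the Compile list -->
        <Compile Remove="Generated.cs" />
        <!-- Update metadata on the remaining Compile items -->
        <Compile>
          <WarningLevel>3</WarningLevel>
        </Compile>
      </ItemGroup>
    </Target>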

You'll also see that you can now access registry keys directly in MSBuild using the new "$(Registry:Hive\Key@ValueName)" syntax.
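
A minimal sketch (the key and value names are illustrative):

    <PropertyGroup>
      <!-- Read a registry value when the project is evaluated -->
      <SdkDir>$(Registry:HKEY_LOCAL_MACHINE\SOFTWARE\Contoso\Sdk@InstallDir)</SdkDir>
    </PropertyGroup>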

Finally, you should see performance improvements in many scenarios even without multiprocessor support enabled. Incremental builds of relatively large projects have gotten significantly faster in many cases.

Internal Adoption
"Dogfooding" is the strange name that Microsoft uses for using its own projects internally. We always try to use our own products extensively, starting well before they ship, to get feedback and fix issues before any customers encounter them. For example, you won't be surprised to learn that Microsoft's e-mail systems are always running on recent internal builds of Exchange Server. Right from the start we knew that "dogfooding" would be a top priority for MSBuild if we wanted to be a first-class build system. Early on we began an effort to convert the build of the Visual Studio product from nmake and build.exe to use msbuild.exe.

Visual Studio and its accompanying tools comprise several million lines of C++, C#, and Visual Basic, and over the years their makefile-based build process had become dauntingly complex. Converting Visual Studio to use MSBuild took part of the team about two years, but as our tools and build process matured, our progress got faster and faster.

The payoff was huge: we discovered and fixed numerous bugs and scalability problems, added some features, and learned a great deal - which will be fed right back into MSBuild. Our goal was to prove that MSBuild could scale to any customer project, and we believe we have achieved that.

Future Directions for MSBuild
You're probably wondering what to expect from future versions of MSBuild. Our plans aren't firm, so it's important to know that these aren't commitments, but a list of some of the features we'd like to ship at some point. We'd welcome information about what you'd like to see to help us prioritize our work: e-mail our team at [email protected].
•  Native build support. By far our biggest request. This includes the tasks and targets necessary to build Visual C++ projects on the command line and the Visual Studio support to load and modify them just as you can do today with Visual Basic and C# projects. Of course, we'd want to provide a conversion tool to migrate from today's Visual C++ format to MSBuild syntax projects.
•  Make creating tasks easier. We'd like to make it possible to create tasks without compiling - by writing script or XML directly in the targets file, with strongly typed inputs and outputs just like a regular task. These tasks can then be shared by copying and pasting.
•  Extensible dependency analysis. MSBuild today can compare the timestamps of input and output files, but you must specify them explicitly. We'd like to discover these directly, even transitive dependencies. We'd also like to make it possible to plug in different kinds of dependency checking.
•  Improved multiprocessor performance both by speeding up the code path and improving our scheduling. Customers have also asked us to provide the option of automatically distributing builds over multiple machines.
•  Improved tooling around MSBuild. For example, better debugging, visualization, and analysis tools for large builds.
•  Move more project formats over to MSBuild. Visual C++, deployment, and Web projects should be based on MSBuild. Outside of Microsoft, we encourage vendors of software built on the Microsoft platform to transition their project files and build process descriptions if they're interested in the features that MSBuild offers.

Summary
We hope this article was a useful overview of where MSBuild is today and our plans for the future. For too long, build has been one of the least glamorous parts of the software development process; it's our goal to make it exciting and to provide developers with big productivity improvements too! We'd love to hear your feedback: you can find us on the MSBuild forum at http://forums.microsoft.com/MSDN, or on our blog at http://blogs.msdn.com/msbuild. If you prefer e-mail, our address is [email protected].

About the Authors

Xin Yan has been a software design engineer at Microsoft for over seven years. He works on the Visual Studio developer tools platform team.

Dan Moseley is a software developer on the MSBuild and Visual Studio Project team.


IoT & Smart Cities Stories
The deluge of IoT sensor data collected from connected devices and the powerful AI required to make that data actionable are giving rise to a hybrid ecosystem in which cloud, on-prem and edge processes become interweaved. Attendees will learn how emerging composable infrastructure solutions deliver the adaptive architecture needed to manage this new data reality. Machine learning algorithms can better anticipate data storms and automate resources to support surges, including fully scalable GPU-c...
Machine learning has taken residence at our cities' cores and now we can finally have "smart cities." Cities are a collection of buildings made to provide the structure and safety necessary for people to function, create and survive. Buildings are a pool of ever-changing performance data from large automated systems such as heating and cooling to the people that live and work within them. Through machine learning, buildings can optimize performance, reduce costs, and improve occupant comfort by ...
The explosion of new web/cloud/IoT-based applications and the data they generate are transforming our world right before our eyes. In this rush to adopt these new technologies, organizations are often ignoring fundamental questions concerning who owns the data and failing to ask for permission to conduct invasive surveillance of their customers. Organizations that are not transparent about how their systems gather data telemetry without offering shared data ownership risk product rejection, regu...
René Bostic is the Technical VP of the IBM Cloud Unit in North America. Enjoying her career with IBM during the modern millennial technological era, she is an expert in cloud computing, DevOps and emerging cloud technologies such as Blockchain. Her strengths and core competencies include a proven record of accomplishments in consensus building at all levels to assess, plan, and implement enterprise and cloud computing solutions. René is a member of the Society of Women Engineers (SWE) and a m...
Poor data quality and analytics drive down business value. In fact, Gartner estimated that the average financial impact of poor data quality on organizations is $9.7 million per year. But bad data is much more than a cost center. By eroding trust in information, analytics and the business decisions based on these, it is a serious impediment to digital transformation.
Digital Transformation: Preparing Cloud & IoT Security for the Age of Artificial Intelligence. As automation and artificial intelligence (AI) power solution development and delivery, many businesses need to build backend cloud capabilities. Well-poised organizations, marketing smart devices with AI and BlockChain capabilities prepare to refine compliance and regulatory capabilities in 2018. Volumes of health, financial, technical and privacy data, along with tightening compliance requirements by...
Predicting the future has never been more challenging - not because of the lack of data but because of the flood of ungoverned and risk laden information. Microsoft states that 2.5 exabytes of data are created every day. Expectations and reliance on data are being pushed to the limits, as demands around hybrid options continue to grow.
Digital Transformation and Disruption, Amazon Style - What You Can Learn. Chris Kocher is a co-founder of Grey Heron, a management and strategic marketing consulting firm. He has 25+ years in both strategic and hands-on operating experience helping executives and investors build revenues and shareholder value. He has consulted with over 130 companies on innovating with new business models, product strategies and monetization. Chris has held management positions at HP and Symantec in addition to ...
Enterprises have taken advantage of IoT to achieve important revenue and cost advantages. What is less apparent is how incumbent enterprises operating at scale have, following success with IoT, built analytic, operations management and software development capabilities - ranging from autonomous vehicles to manageable robotics installations. They have embraced these capabilities as if they were Silicon Valley startups.
As IoT continues to increase momentum, so does the associated risk. Secure Device Lifecycle Management (DLM) is ranked as one of the most important technology areas of IoT. Driving this trend is the realization that secure support for IoT devices provides companies the ability to deliver high-quality, reliable, secure offerings faster, create new revenue streams, and reduce support costs, all while building a competitive advantage in their markets. In this session, we will use customer use cases...