
MSBuild - What It Does and What You Can Expect in the Future

The standard customizable build platform for the .NET Framework

The final XML element we'll cover on this tour is Import, and it's the key to making your build process reusable: it pulls in another MSBuild file. Right at the bottom of your Visual Basic and C# project files, you'll see an Import tag that pulls in the "Microsoft.CSharp.targets" or "Microsoft.VisualBasic.targets" file, which is the root of the build process for these projects. Both targets files in turn import the "Microsoft.Common.targets" file. There's more to MSBuild syntax, which is covered in the online documentation, but this overview should give you an idea.
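As a sketch, the tail of a C# project file looks something like this (the body of the project is elided; $(MSBuildBinPath) is a reserved property pointing at the MSBuild install folder):

```xml
<Project DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <!-- properties and items for this particular project go here ... -->

  <!-- Pulls in the shared C# build process, which in turn
       imports Microsoft.Common.targets -->
  <Import Project="$(MSBuildBinPath)\Microsoft.CSharp.targets" />
</Project>
```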

Traditionally, build processes log to the console and to log files, and the information logged is fairly hard-coded: the folder that's currently building, the command lines executing, and their output, including any warnings and errors. MSBuild was designed to log in a much richer fashion. Strongly typed .NET events are fired at each step, and any number of "loggers" can subscribe to the events they want and do whatever they like with the information. Events include "ProjectStarted" and "Message", the latter of which can be emitted by a <Message> task in the build. Output from tools the build launches is logged as Message, Error, or Warning events as appropriate.
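For instance, a minimal sketch of a target that raises a Message event for attached loggers to record (the target name and text are invented for illustration):

```xml
<Target Name="ShowConfig">
  <!-- Fires a Message event that any attached logger can record;
       Importance controls at which verbosity levels it appears -->
  <Message Text="Building configuration: $(Configuration)" Importance="high" />
</Target>
```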

The build process is abstracted from the loggers, which are chosen when the build starts. For example, a builder might simultaneously attach a console logger, several file loggers with different verbosity settings, and a logger that inserts event details in a database. MSBuild ships with a built-in console logger, which is attached by default, and a file logger, both of which have adjustable verbosity settings. Like tasks, loggers are just .NET classes and you can write your own - start by deriving from the Logger class.

How MSBuild Builds a Project
MSBuild can be invoked directly on the command line (msbuild.exe) or inside Visual Studio. Either way the build is handled by the same MSBuild engine. Here's what MSBuild does:

1.  First, any global properties are applied to the project. Global properties are special: they can't be modified except within a Target. If MSBuild is invoked from the command line, the set of global properties will be the ones passed in via the /property: command line switch. If MSBuild is invoked as part of Visual Studio, the active configuration and platform are the global properties.
2.  Starting from the beginning of the project file, the MSBuild engine evaluates the properties defined in the project. Imported targets files are loaded and evaluated as the engine progresses: think of them as inserted directly into the project file. If a property is defined in two or more places, the last one wins.
3.  After the properties are evaluated, the MSBuild engine evaluates all the item definitions, then makes a third pass to evaluate items. At every step of the evaluation, all previously evaluated property values and item values can be used.
4.  MSBuild starts to execute the entry-point targets. These targets can be specified via the /target: command line switch, or the default targets defined in the project file through the Project element's DefaultTargets attribute; otherwise MSBuild just picks the first target in the file. Visual Studio's entry points are fixed: Build, Clean, Rebuild, or Publish.

If the condition on a target is true, MSBuild will make sure all its dependencies are executed, then execute the target itself, invoking the tasks in it one by one.
5.  By using the special "MSBuild" task, a project can cause another project to build. If a project references the outputs of another, it will use the MSBuild task to make sure that project is up-to-date. Once the referenced project is done building, the MSBuild engine will resume building the referencing project.
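Putting the evaluation steps together, here's a small illustrative project (all names invented) showing last-one-wins property evaluation, an entry-point target declared via DefaultTargets, and a target dependency:

```xml
<Project DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <OutDir>bin\</OutDir>
  </PropertyGroup>
  <PropertyGroup>
    <!-- Defined a second time, so this value wins (step 2: last one wins) -->
    <OutDir>output\</OutDir>
  </PropertyGroup>

  <Target Name="PrepareOutput">
    <MakeDir Directories="$(OutDir)" />
  </Target>

  <!-- Entry-point target (step 4): its dependencies execute first -->
  <Target Name="Build" DependsOnTargets="PrepareOutput">
    <Message Text="Output goes to $(OutDir)" />
  </Target>
</Project>
```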

To see what's happening in the build, specify the "detailed" or "diagnostic" verbosity level. To do this from the command line, use the /verbosity: switch to adjust the amount of information displayed during the build. In Visual Studio, this can be customized in the Tools/Options dialog: on the "Build and Run" node under "Projects and Solutions" in the left pane, you'll see a dropdown for "MSBuild project build output verbosity."

MSBuild can divide item lists into different "batches" using their metadata then execute each batch one at a time. This can be confusing, but once you get used to the idea it feels natural - and powerful.

Here's an example that creates a different folder for each value of the "Culture" metadata on an item list. You'll see that the Culture metadata, referenced with the special "%(...)" metadata syntax, is passed into the task: that tells MSBuild to run the task once for each unique Culture value. In this example, the task runs twice.

    <ItemGroup>
        <Resource Include="Form1.en-CA.resx;Form2.en-CA.resx">
            <Culture>en-CA</Culture>
        </Resource>
        <Resource Include="Form1.en-NZ.resx">
            <Culture>en-NZ</Culture>
        </Resource>
    </ItemGroup>

    <Target Name="CreateCultureDirs">
        <MakeDir Directories="%(Resource.Culture)"/>
    </Target>

Here's a trick that sometimes comes in handy: to execute a task once for each item, you can use the well-known item metadata Identity, which usually has a different value for each item.
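A sketch of that trick (the item and target names are invented): batching on %(Identity) makes the task run once per item.

```xml
<ItemGroup>
  <FileToTouch Include="readme.txt;license.txt" />
</ItemGroup>

<Target Name="TouchEach">
  <!-- %(FileToTouch.Identity) is unique per item, so the Touch task
       executes once for each file rather than once for the whole list -->
  <Touch Files="%(FileToTouch.Identity)" />
</Target>
```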

Hosting the MSBuild Engine
If you like, you can host the MSBuild engine in your own application, just as Visual Studio does. By referencing the Microsoft.Build.BuildEngine assembly, you can load, edit, and build projects. Perhaps you're writing a tool for analyzing a build tree, or even a product, like Visual Studio, that needs to read and write MSBuild-format projects.

As we mentioned, not all Microsoft project types are in MSBuild format right now. In Visual Studio 2005, Visual C++ projects are in their own format, as are Visual Studio Deployment projects and ASP.NET Web site projects. Visual Studio solution files are still in their own format too. Of course, most of us have some of these projects, and we need to build them on the command line too.

MSBuild does its best to interoperate with them. It can read Visual Studio solution files by translating them internally into MSBuild format. When a Visual C++ project is encountered, MSBuild will try to build it with vcbuild.exe, which ships as part of Visual Studio and the .NET SDK. For ASP.NET Web site projects, MSBuild will launch a tool to pre-compile them. In both cases, MSBuild generally won't require Visual Studio to be installed to build them. Indeed, Visual Studio Team System's Build Server feature is designed to use MSBuild on dedicated build machines that don't have the full Visual Studio product installed.

The interoperability isn't perfect; as you'd expect, the different formats don't translate exactly. We hope MSBuild handles them well enough for most people to get by until they are all based on MSBuild. If you don't get the results you need, the fallback is to have Visual Studio installed and build on the command line with Visual Studio's devenv.exe application.

More Stories By Xin Yan

Xin Yan has been a software design engineer at Microsoft for over 7 years. He works on the Visual Studio developer tools platform team.

More Stories By Dan Moseley

Dan Moseley is a software developer on the MSBuild and Visual Studio Project team.

