The Microsoft Dynamic Systems Initiative and Enterprise Architecture

With Whitehorse, Microsoft has placed a significant stake in the ground when it comes to modeling enterprise services. While Whitehorse is part of the not-yet-released Visual Studio 2005 (codenamed "Whidbey"), Microsoft has publicly discussed and demonstrated significant elements of Whitehorse, and alpha code is currently in use by select Microsoft customers. This article discusses key Whitehorse concepts and capabilities that have been previously announced, and puts them in context with the key forces, such as the Microsoft Dynamic Systems Initiative and the emerging discipline of enterprise architecture, that are driving enterprises to embrace modeling as they move toward service-oriented architecture (SOA)-based development.

What Is Whitehorse?
To quote from the Microsoft Whitehorse FAQ (whitehorsefaq.aspx), Whitehorse "is a feature of Visual Studio Whidbey that simplifies the architecture, design, and development of applications comprised of distributed services. The service-oriented application designer consists of a number of tools, including the distributed services designer, which enables architects to design their application architecture visually. Developers can work with code generated from this tool and keep code changes synchronized with the visual design. Additionally, the logical system architecture designer allows infrastructure architects to visually model the data center, map the application to the data center, and validate it against the constraints of the application/data center prior to actual deployment. Reports generated from this help document the deployment mapping."

Microsoft is making three key points in this statement; let's inspect each one in turn to understand its relevance to enterprise development and operations teams. First, Whitehorse introduces a distributed services designer that enables architects to design their application architecture visually. Why is this significant? Given the huge industry trend toward Web services and SOAs, including Microsoft's currently available Web Services Enhancements (WSE) and future Indigo capabilities, the Visual Studio team has recognized the importance of managing service definition and development visually, through a modeling environment tightly linked with underlying code generation.

Second, Whitehorse provides a logical system architecture designer that allows infrastructure architects to visually model the data center. While tools already exist to manage IT assets (servers, network hardware, and the like), Whitehorse goes further, enabling operations team members both to document the existence of those assets and to describe their operational characteristics, such as security settings and the types of components that may be deployed on them.

Finally, Whitehorse allows application designers to validate application architectures against targeted data center deployments. While other Whitehorse features are important, this one is perhaps the most significant in the application development/deployment life cycle. By allowing developers to validate their application structure during the design phase, Whitehorse helps development and operations staff communicate effectively with each other, avoiding the "big oops" that occurs when an application that has been fully tested in the development/QA environment cannot be deployed into the operational environment, or performs so poorly there (in speed or stability, for example) that its deployment is impractical.

DSI and Enterprise Architecture
Some of you may have heard of Microsoft's Dynamic Systems Initiative (DSI) and may also be familiar with the emerging field of enterprise architecture. Where does Whitehorse fit within these concepts? Whitehorse is one of the primary tools within Microsoft's Dynamic Systems Initiative and, as such, will be one of the key ways that DSI is exposed to the enterprise developer and operations communities. DSI, in turn, expresses some key enterprise architecture concepts in concrete terms for Windows- and .NET-based application architectures. Let's explore DSI and enterprise architecture to provide a context for the value of Whitehorse within an enterprise IT environment.

Dynamic Systems Initiative and the System Definition Model
Quoting again from Microsoft's Whitehorse FAQ, the Microsoft Dynamic Systems Initiative "is a broad Microsoft and industry initiative uniting hardware, software and service vendors around a new software architecture based on the System Definition Model (SDM). This new architecture is becoming the focal point for how we are making product investments to dramatically simplify and automate how our customers will develop, deploy, and operate applications and IT infrastructure. The System Definition Model (SDM) is a live Extensible Markup Language (XML) blueprint that spans the IT life cycle and unifies IT operational policies with the operational requirements of applications. It is relevant at both design time and at run time. At design time, it will be exposed through Visual Studio to enable IT operators to capture their policies in software and developers to describe application operational requirements. At deployment time, the SDM description of the application will enable the operating system to automatically deploy the complete application and dynamically allocate a set of distributed server, storage, and networking resources that the application requires."

In essence, the SDM provides a single consolidation point that gives application developers a precise and concise way to document the operational needs of a distributed system. Once documented in this way, an SDM instance can be used to communicate those operational needs to the IT staff responsible for defining and maintaining the organization's operational IT infrastructure. Tools such as Whitehorse will be used both to generate SDM document instances, which represent a distributed application's operational requirements, and to validate those requirements against a candidate operational deployment topology.
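
Although Microsoft had not published the final SDM schema at the time of writing, a minimal sketch can convey the idea. The following hypothetical, SDM-style XML blueprint (the element and attribute names are invented for illustration) declares a service's operational requirements, and a few lines of Python read them back using only the standard library:

```python
import xml.etree.ElementTree as ET

# Hypothetical SDM-style blueprint; the element and attribute names
# below are invented for illustration, not the actual SDM schema.
SDM_BLUEPRINT = """
<systemDefinition name="ECommerceApp">
  <service name="OrderService">
    <requirement name="authentication" value="integrated" />
    <requirement name="webServicesEnabled" value="true" />
  </service>
</systemDefinition>
"""

root = ET.fromstring(SDM_BLUEPRINT)
for service in root.findall("service"):
    print(f"Service: {service.get('name')}")
    for req in service.findall("requirement"):
        # Each requirement is an operational need the target data
        # center must satisfy before the service can be deployed.
        print(f"  requires {req.get('name')} = {req.get('value')}")
```

A real SDM document would also capture hosting relationships and resource allocations; the point is simply that the blueprint is machine-readable and usable at both design time and deployment time.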

Enterprise Architecture
The bridging of application to operational needs is one aspect of an emerging IT discipline called enterprise architecture. One definition states that enterprise architecture "provides, on various architecture abstraction levels, a coherent set of models, principles, guidelines, and policies, used for the translation, alignment, and evolution of the systems that exist within the scope and context of an Enterprise." (www.geao.org/aboutea/definition.jsp)

A concrete example of an enterprise architecture is the Federal Enterprise Architecture (FEA), an initiative driven by the federal government to support and encourage cross-agency collaboration, transformation, and government-wide productivity improvements. While many details of the FEA are relevant only to government activities, the FEA also lays out a useful architectural structure that is increasingly being used by other enterprises to scope and manage their architectural activities. The FEA is composed of the following five layers:

  • Performance Reference Model (PRM): Framework to measure the performance of major IT investments and their contribution to program performance
  • Business Reference Model (BRM): Function-driven framework for describing the business operations of the federal government, independent of the agencies that perform them
  • Service Component Reference Model (SRM): Business- and performance-driven, functional framework that classifies service components with respect to how they support business and/or performance objectives
  • Data and Information Reference Model (DRM): Model that describes, at an aggregate level, the data and information that support program and business line operations
  • Technical Reference Model (TRM): Component-driven, technical framework used to identify the standards, specifications, and technologies that support and enable the delivery of service components and capabilities
When applied to application development, this architectural framework can be used to describe the entire application project life cycle, from identifying the initial business need (PRM) to specifying the new processes and functions (and modifications to existing processes and functions) required to meet that business need (BRM), to defining the functional services and data elements that must be implemented to support the application meeting the business need (SRM and DRM), to specifying how the services and application components consuming those services will be deployed on the organization's IT infrastructure (TRM). Whitehorse and the DSI fit directly within this architectural framework, addressing the SRM and TRM layers and, more specifically, the definition and binding of application services specified as part of an enterprise's SRM to server and network infrastructure specified by the enterprise's TRM.

You may now be asking yourself, "Why are all of these models necessary? Why can't I just create services and deploy them as I need them?" While development tools, such as Whitehorse, will certainly make it much easier to create and deploy services, it's important to keep sight of the big picture. Managing your services to prevent functional redundancy and to make sure that you are building the right services at the right time is a big part of what an enterprise architecture is designed to do. In fact, most enterprise architects will recommend that a repository be used to manage and make searchable an organization's business processes, service definitions, and deployed services instances and their interrelationships. That said, incremental definition, development, and deployment of business processes and their supporting services within an enterprise architecture is clearly the way to make progress in moving from the "what is" state to the "what should be" state, as specified by the architecture. Whitehorse gives developers a highly effective toolset to do just that, selecting existing and building new services that are subsequently combined into applications designed to support new and changing business processes.

A Whitehorse Working Scenario
The remainder of this article will show how Whitehorse can be used to enable application design and deployment to a targeted operational infrastructure in three steps:

  • Application design
  • Operational infrastructure design
  • Application validation against operational infrastructure

Application Design
We have decided that our example application, a typical e-commerce site, will be built using the Microsoft Enterprise Solution Pattern "Three-Layered Services Application" as an architectural guideline. This pattern specifies that applications should be composed of three layers: UI Components, which present business functionality to application users; Service Interfaces, which expose consolidated business functionality (driven perhaps by SRM definitions extracted from our enterprise architecture); and Data Access Components, which encapsulate and present data managed by relational databases. Whitehorse allows us to define our components and services, retrieving and importing existing service definitions where they exist and specifying new ones as needed. We can then use the Whitehorse visual modeling surface to wire our application services and components together, as shown in Figure 1. (Figures in this article were extracted from the Microsoft DSI Overview whitepaper.)
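
The layering itself is easy to see in code. Here is a minimal, hypothetical sketch of the pattern (class and method names are invented for illustration; this is not code generated by Whitehorse), showing that dependencies flow in only one direction, from the UI through the service interface to data access:

```python
# A minimal, hypothetical sketch of the "Three-Layered Services
# Application" pattern; all names are invented for illustration.

class ProductDataAccess:
    """Data Access Component: encapsulates the relational database."""
    def find_product(self, product_id: int) -> dict:
        # A real implementation would issue a database query here.
        return {"id": product_id, "name": "Widget", "price": 9.99}

class CatalogService:
    """Service Interface: exposes consolidated business functionality."""
    def __init__(self, data_access: ProductDataAccess) -> None:
        self.data_access = data_access

    def get_product_details(self, product_id: int) -> dict:
        return self.data_access.find_product(product_id)

class CatalogPage:
    """UI Component: presents business functionality to users."""
    def __init__(self, service: CatalogService) -> None:
        self.service = service

    def render(self, product_id: int) -> str:
        product = self.service.get_product_details(product_id)
        return f"{product['name']}: ${product['price']:.2f}"

# Wire the layers together: UI -> service interface -> data access.
page = CatalogPage(CatalogService(ProductDataAccess()))
print(page.render(42))
```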

Operational Infrastructure Design
In parallel with our application design efforts, our IT operations staff, using the principles defined by the Microsoft Enterprise Solution Pattern, "Deployment Plan," defines the operational (i.e., data center) server/network topology against which this application (and all other enterprise applications) must be deployed.
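
In Whitehorse this topology is modeled visually, but conceptually it amounts to a catalog of servers and their operational settings. A minimal, hypothetical sketch (server names and settings are invented for illustration):

```python
from dataclasses import dataclass, field

# Hypothetical data center topology; server names and settings are
# invented for illustration.

@dataclass
class Server:
    name: str
    role: str               # e.g., "web", "application", "database"
    settings: dict = field(default_factory=dict)

# A small e-commerce topology: a locked-down web farm in front of an
# application server and a database server.
topology = [
    Server("WebFarm01", "web",
           {"authentication": "anonymous", "webServicesEnabled": False}),
    Server("AppServer01", "application",
           {"authentication": "integrated", "webServicesEnabled": True}),
    Server("DbServer01", "database",
           {"authentication": "integrated"}),
]

for server in topology:
    print(f"{server.name} ({server.role}): {server.settings}")
```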

Application Validation
Once we have the data center structure in hand, we can apply our designed application services and components to the operational server/network topology, using Whitehorse's drag-and-drop capabilities. This activity can be described as a four-step process:

  • Import operational topology definitions into our Whitehorse project
  • Drag and drop our application components and services onto server instances defined by the imported operational topology
  • Validate the compatibility between application elements and server instances
  • Reconcile incompatibilities by modifying application component and service requirements and/or data center topology definitions
Some example constraints that might be flagged by Whitehorse (and that we will subsequently need to resolve) include:
  • Only anonymous access will be supported on a specific server, so we can't deploy an application or service that requires user authentication on that server.
  • A server is configured to prevent Web services from running, so we can't deploy a Web service on that server.
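
Continuing the hypothetical structures sketched above, the essence of this validation step is a comparison between what each application element requires and what its candidate server offers. A minimal sketch follows (Whitehorse's actual validation rules and vocabulary may differ):

```python
# Hypothetical design-time validation in the spirit of Whitehorse;
# requirement and setting names are invented for illustration.

# What the application element requires of its host.
app_requirements = {
    "OrderService": {"authentication": "integrated",
                     "webServicesEnabled": True},
}

# What each server in the data center topology offers.
server_capabilities = {
    "WebFarm01": {"authentication": "anonymous",
                  "webServicesEnabled": False},
}

def validate(component: str, server: str) -> list:
    """Return the incompatibilities between a component and a server."""
    required = app_requirements[component]
    offered = server_capabilities[server]
    return [
        f"{component} requires {name}={value}, "
        f"but {server} offers {name}={offered.get(name)}"
        for name, value in required.items()
        if offered.get(name) != value
    ]

# Dropping OrderService onto WebFarm01 flags both example constraints:
# the server allows only anonymous access and blocks Web services.
for problem in validate("OrderService", "WebFarm01"):
    print(problem)
```
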
Once we have successfully reconciled all incompatibilities, we can continue with our application implementation and deployment with the knowledge that our application is designed to be compatible with our organization's operational infrastructure.

Whitehorse is a major advance in .NET development tooling, designed both to enable application developers to rapidly define applications and their constituent services and components, and to help development and operations staff communicate systems-oriented requirements and infrastructure dependencies early in the application development life cycle. For more information about Whitehorse and other topics discussed in this article, see the following sites:

  • MSDN TV session on Whitehorse
  • Dynamic Systems Initiative: www.microsoft.com/windowsserversystem/dsi/dsioverview.mspx
  • Enterprise architecture: www.eacommunity.com
  • Federal Enterprise Architecture: www.feapmo.gov

Brent Carlson is vice president of technology and cofounder of LogicLibrary, a provider of software development asset (SDA) management tools. He is the coauthor of two books: San Francisco Design Patterns: Blueprints for Business Software (with James Carey and Tim Graser) and Framework Process Patterns: Lessons Learned Developing Application Frameworks (with James Carey). He also holds 16 software patents, with eight more currently under evaluation.
