Five Pillars of Visual Basic Retirement

Great Migrations LLC case study

Introduction
In the past, Visual Basic (VB) upgrades were fairly painless and inexpensive because Microsoft made new versions of VB backward compatible, but things are different this time. An upgrade to .NET brings with it a radical shift in terms of architecture, design, deployment, features, and tools. The upgrade will be even more challenging if you decide to move from the forgiving VB compiler to the rigorous C# compiler. Confronted with declining vendor and community support and major migration challenges, we at BMW Financial Services in Dublin, Ohio, set out to define a strategy that would allow us to adopt C# in an efficient and deliberate manner. Our objectives were to minimize disruption and costs and to leverage the momentum of the platform change to move our capabilities forward. This article presents our strategy and some of the experiences we are encountering along the way.
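To make the "forgiving versus rigorous" point concrete, here is a minimal, hypothetical illustration (not code from our systems): VB6 silently coerces a string operand into a number, while the C# compiler forces the conversion to be written out.

    // VB6 compiles and runs the following, coercing the string to a number:
    //
    //   Dim total As Integer
    //   total = "42" + 8    ' VB6 coerces "42" to 42, so total = 50
    //
    // The equivalent C# will not compile until the conversion is explicit
    // (illustrative translation only):
    int total = int.Parse("42") + 8;   // the conversion is now visible and checked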

On the Road to .NET
BMW Financial Services is committed to Microsoft technologies and has used Visual Basic (VB) as the mainstay of application development for over 10 years. Our applications began taking shape in 1994, even as VB itself was still growing up. In early 2000, I was hired to manage the newly established system architecture team. We found ourselves facing a wide variety of architectures and coding styles: although we were almost purely a VB shop, we had several different application frameworks, our code employed a number of different "standards" for common tasks, and we used over one hundred different third-party COM components. Our intent was to standardize our architectures and to implement more intelligent code reuse, but such changes are expensive, and building a case purely on the basis of architectural goodness is nearly impossible.

By 2001, Microsoft had begun to promote its new, improved development platform - the next-generation toolset that would become .NET. We decided that this new platform should become a key component of our standardization strategy. As luck would have it, in early 2002 we found ourselves at a crossroads: we were about to embark on a major CRM package implementation, and the package warranty did not allow us to use stored procedures - the other mainstay of our application development. We would have to use the package's APIs. By now, .NET was entering the mainstream, particularly for developing middle-tier Web services. It was a match made in heaven: we needed a solid platform for integrating with the CRM package, and .NET fit the bill perfectly. By the end of the year, we had used C# (see Sidebar: Choosing a .NET Language) to design, build, and deploy a service-oriented middle tier that integrated our legacy applications with the CRM solution. More importantly, we had also educated management about the impending loss of VB support and gotten the go-ahead to build additional .NET frameworks that would form the foundation for migrating to .NET. (Sidebar 1)

We planned three application frameworks to help us migrate: one for desktop applications, one for Web applications, and one for batch jobs. We fondly refer to these frameworks as DesktopCAFE and WebCAFE (CAFE stands for Common Application Framework for the Enterprise) and BEEF (Batch Environment Execution Framework). Each framework leverages .NET and the Microsoft Application Blocks / Enterprise Library to provide architectural support for quickly assembling robust, modular applications. (Sidebar 2)
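To give a feel for what framework-based code looks like, here is a minimal sketch of a BEEF-style batch job. The IBatchJob interface and JobContext class are assumptions invented for this illustration; they are not the actual CAFE/BEEF APIs.

    // Hypothetical sketch only - not the actual BEEF API. The framework
    // hosts jobs behind a small interface and supplies cross-cutting
    // services (logging shown here) through a context object.
    public class JobContext
    {
        public void Log(string message)
        {
            System.Console.WriteLine("{0:u} {1}", System.DateTime.Now, message);
        }
    }

    public interface IBatchJob
    {
        void Execute(JobContext context);
    }

    public class InvoiceExtractJob : IBatchJob
    {
        public void Execute(JobContext context)
        {
            context.Log("Starting invoice extract...");
            // business logic only; scheduling, retries, and error
            // reporting are the framework host's job
        }
    }

The design intent is that application teams write only the Execute body, while the framework standardizes everything around it.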

By the end of 2003, we had published the Batch Architecture Strategy and the WebCAFE adoption strategy, and had deployed a very basic version of DesktopCAFE. Also in 2003, our service-oriented middle tier grew at a frightening pace. In our haste to provide an easy-to-use service framework for the CRM project, we had put very little governance around the creation of new services. This was both good and bad. On the up side, the C# middle tier provided a ready alternative to writing more VB. On the down side, it was the only alternative, and we soon had a library of over 100 services of questionable purpose.
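For context, middle-tier services of that era were typically ASMX Web services. The sketch below illustrates the general shape of such a service; the class and method names are invented for this example and are not one of our actual services.

    // Generic illustration of a .NET 1.x ASMX Web service; the names
    // are hypothetical.
    using System.Web.Services;

    public class CustomerLookupService : WebService
    {
        [WebMethod]
        public string GetCustomerStatus(string customerId)
        {
            // A governed service library would enforce naming, versioning,
            // and reuse standards here; ours, at first, did not.
            return "ACTIVE";   // placeholder result
        }
    }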

VB Retirement Strategy
Our VB Retirement Strategy was a small document (26 pages) with several parts. It presented the case for VB retirement, described the as-is and to-be states of our processes and architectures, and spelled out our guiding principles for .NET adoption. Most of the strategy was dedicated to describing the process improvements and other efforts that would be needed to ensure success. One of the most important aspects of the strategy was the set of guiding principles that would steer our decisions about .NET adoption. These principles are listed below:

  • The effort will be gradual, spread across a three-year timeline, and scheduled to be complete prior to the Microsoft Visual Basic retirement date in March 2008.
  • We will seek to minimize impact to business projects by aligning migration efforts with scheduled projects and avoiding code freezes.
  • We will have a strong first-year commitment to research and analysis so we can "get smarter" about the challenges up front and enjoy more predictable, efficient migrations in years two and three.
  • We will attempt to leverage automated translation tools to the extent feasible.
  • We will do a straight port of business logic, but allow structural changes in cases where technical incompatibilities exist or where we can move to common frameworks.
  • Architecture/logic improvements can also occur if the cost of doing migration and project work together is lower than the cost of doing them separately.
The bulk of the strategy was dedicated to describing the five processes we would need in order to ensure a successful migration. Our mission would be to mature these five processes and use them to power the migration. These five processes are described here:
  • Rearchitecting: This is implementing non-functional improvements to our systems. Our definition of rearchitecting goes beyond the syntax changes and API replacements inherent in the VB-to-.NET translation. Rearchitecting is concerned with taking advantage of object-oriented principles, structured exception handling, and other .NET features (a small example follows this list). It also entails restructuring how the components and layers of the application interact with one another. New frameworks are a key aspect of the rearchitecting process. Another major aspect is the implementation of transitional architectures that allow us to incrementally retire legacy architectures as we move to new ones.
  • Translation: This is the most basic work of the migration: converting working VB projects to functionally equivalent, maintainable C# projects. Our existing code is the best specification we have for our systems because it's accurate, detailed, and has been tested in use. Translation allows us to leverage this asset. The translation process deals with defining detailed VB-to-C# conversion standards, tuning conversion tools, and developing refactoring techniques.
  • Retooling: This is updating development processes and developer skills to work with .NET. The retooling effort is concerned with overcoming the challenges that mixed language development presents for our configuration management, build, and deployment procedures. Retooling also involves training developers and enforcing conversion standards.
  • Testing: This is making sure the first three processes are working in a way that limits defects. Our organization has a streamlined quality culture: "we only test what has changed." This model does not scale very well when almost everything is changing. We are still coming to grips with this, and we are taking steps to ensure that quality is built into the conversion process up-front rather than expecting to test it in at the end.
  • Managing: This is the work of coordinating technical teams, synergizing migration work with other projects, ensuring continuous improvements, monitoring progress, managing the budget, and removing obstacles.
These five processes interact to reduce the cost and increase the quality and speed of the .NET adoption effort. For example, an investment in translation capability will result in higher-quality C# code that is more correct and ready for rearchitecting. An investment in retooling will result in more repeatable translations, builds, and deployments, as well as smoother adoption of new architectures and tools. Rearchitecting will yield more complete specifications for translation and a solid target for migrated applications. Testing will ensure that other migration processes are working properly. Managing coordinates and balances these activities to make the most efficient use of program resources. Figure 1 shows how these processes relate to each other.
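As a concrete, hedged example of what rearchitecting and translation produce together, consider replacing a VB6 On Error handler with structured exception handling. The code below is illustrative only (LookupBalance is an assumed helper); our actual conversion standards were more detailed.

    // VB6 original (error handling via On Error GoTo):
    //
    //   Public Function ReadBalance(acct As String) As Currency
    //       On Error GoTo ErrHandler
    //       ReadBalance = LookupBalance(acct)
    //       Exit Function
    //   ErrHandler:
    //       ReadBalance = 0
    //   End Function
    //
    // Rearchitected C# (illustrative translation; requires using System;):
    public decimal ReadBalance(string acct)
    {
        try
        {
            return LookupBalance(acct);   // assumed helper, as in the VB original
        }
        catch (Exception ex)
        {
            // Structured handling surfaces the failure instead of silently
            // returning zero; the logging/rethrow policy is a rearchitecting
            // decision, not a mechanical translation.
            throw new ApplicationException("Balance lookup failed for " + acct, ex);
        }
    }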

Estimating the Cost of VB Retirement
In addition to agreeing on a strategy, in 2004 we also reached several important .NET adoption milestones:

  • We estimated costs and established a program budget for the next three years.
  • We got buy-in from top-level management.
  • We completed the implementation of WebCAFE and fleshed out the next versions of DesktopCAFE.

More Stories By Mark Juras

Mark Juras is president of Great Migrations LLC, a technology solutions provider that specializes in helping people migrate their software applications from one programming language to another.

Most Recent Comments
Tushar.aipatwar 08/30/09 04:58:00 PM EDT

Hi Mark, I also like the case study. I am doing my dissertation on code migration from VB 6.0 to C#; can you please tell me where I can find more material on this topic?

Mark Juras 03/14/07 10:47:10 AM EDT

Since writing this article, I have had many discussions with IT professionals about the topic of software migration. I took what I learned in those discussions and wrote a second article that compares migration methodologies and addresses the negative perceptions about software translators. If you are curious, please take a look.

http://dotnet.sys-con.com/read/346924.htm

Mike Conner 11/17/06 12:35:30 PM EST

This was a great case study.
Thank you for sharing.

Mark Juras 11/16/06 10:56:10 PM EST

The sample translation in the article is an early form and just one of many possible valid translations.

The power of the tool-assisted approach is its improvability and flexibility. If there is something we want to change about the translation, we reconfigure the promula tool, rerun the translations, and check the result, repeating the process until we get .NET code that is clean and correct. Also, because of the speed of promulaBasic, rerunning the translations for hundreds of VBPs takes only a few minutes, so trying different configurations is feasible.

This up-front investment paid off. Once the tool was tuned to our requirements, every new batch of translations met our standards. We did this tuning for many types of coding patterns, not just data access.
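As a hedged illustration of the rerun loop described above, a small driver like the following can re-translate every VB project under a source tree. The translator invocation is a placeholder; promulaBasic's actual command line is not documented here.

    // Hypothetical batch-retranslation driver; "translate.exe" is a
    // placeholder for the real translator invocation.
    using System;
    using System.Diagnostics;
    using System.IO;

    class RetranslateAll
    {
        static void Main()
        {
            foreach (string vbp in Directory.GetFiles(@"C:\src\legacy", "*.vbp",
                                                      SearchOption.AllDirectories))
            {
                // Run the translator with the current configuration,
                // then inspect the output, adjust the configuration,
                // and run again.
                Process.Start("translate.exe", "\"" + vbp + "\"").WaitForExit();
                Console.WriteLine("Retranslated: " + vbp);
            }
        }
    }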
