
Converting VB6 to VB.NET, Part II

Moving along

Last month (Vol. 2, issue 9), I gave an executive overview of the conversion process and started looking at converting general VB6 code to VB.NET. This month I will finish general conversions, including DLLs, then start on database conversions. Next month, in the final segment, I will cover converting ASP Web pages to ASP.NET, and look at converting to VB.NET 2005 and C#.

Nothing Is Perfect
After the conversion wizard finishes, the upgrade report will probably contain a long list of issues. Many of them are minor things that nothing can be done about, but that in most cases will have little impact on the final program. For instance, if you use the ZOrder property in VB6, the wizard will convert it but add a message that the behavior has changed. Every issue flagged by the wizard includes a link to a URL explaining the problem. In the case of ZOrder, the difference is that in VB6 the form is brought to the front of the application only; in VB.NET the application is also brought to the front of all other applications running on the computer. In most cases this is closer to the desired result than the VB6 behavior.

The Dir command is another example. In VB6, using Dir to list the files in a directory normally returned "." and ".." first in the file list. In VB.NET, "." and ".." are never included in the Dir results. They were not always listed in VB6 either - for instance, they are not part of the root "\" directory - so correct VB6 code checks for these entries by name, and such code will behave correctly in VB.NET. However, some VB6 programs simply skip the first two files returned by Dir, assuming they are "." and ".."; that code will always miss the first two files in VB.NET, and in some cases in VB6 as well. This is another example of well-written code converting well.
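For code that must run correctly under both behaviors, the safe pattern is to test the names explicitly rather than skip entries by position. A minimal sketch in VB.NET (the directory path is a placeholder):

```vb
' Defensive Dir loop: never assumes "." and ".." come first.
Dim fileName As String = Dir("C:\SomeDir\*.*")
Do While fileName <> ""
    If fileName <> "." And fileName <> ".." Then
        Console.WriteLine(fileName)   ' process a real file
    End If
    fileName = Dir()                  ' fetch the next entry
Loop
```

In VB.NET the check is purely defensive, since Dir never returns the two dot entries, but it keeps the same source working in VB6.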

Variables and Attributes
Most VB6 variable types have equivalent types in VB.NET, but there are a couple of twists. An Integer in VB6 is a 16-bit integer; in VB.NET, it is a 32-bit integer. The 16-bit equivalent in .NET is a Short, which is what the wizard will convert a VB6 Integer to. Sometimes this is correct, but where the 16-bit size merely reflects a VB6 limitation that no longer exists, it makes more sense to upgrade the variable to the 32-bit VB.NET Integer. The preferred way to do this is to change the VB6 Integer to a VB6 Long (a 32-bit integer) before the conversion. A complete list of equivalent data types, covering not only VB6 and VB.NET but also IDL (COM), the .NET Framework, and MSIL (.NET assembly language), can be found at (Note From the Author: this URL will only work if Visual Studio is installed on your computer) ms-help://MS.VSCC.2003/MS.MSDNQTR.2003FEB.1033/dv_vstechart/html/vbtchTroubleshootingNETInteroperability.htm.
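The width change can be seen in a pair of declarations; a sketch of what the wizard produces, versus what you get by widening the variable first (variable names are illustrative):

```vb
' VB6 source was:  Dim Count As Integer   (16 bits)
Dim Count As Short       ' what the wizard emits: still 16 bits

' VB6 source, widened before conversion:  Dim Total As Long   (32 bits)
Dim Total As Integer     ' what the wizard emits: the natural 32-bit .NET type
```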

The VB6 Date type is a simple 8-byte variable type; VB.NET uses the .NET DateTime class for the same functionality. In places where the old 8-byte VB6 representation is needed, use the shared DateTime.FromOADate(x) function or the instance ToOADate() method. The wizard will use these functions when converting; for example, MyDate = 100 will be upgraded to MyDate = DateTime.FromOADate(100). I have read that the VB6 Date type is one thing that just will not convert to .NET; these functions have done the few things I have required of them, so I suspect that those who say Date types are hard to upgrade are simply not aware of them. There is so much in .NET that such mistakes are easy to make. When I first converted the MSChart control, I thought the Axis property had not converted (see Part 1 of this series), and spent a lot of time looking for a replacement control. This was not a total waste of time, as I switched to a pure .NET control, which I would have done eventually; but it did delay having working code for the first conversion pass. The lesson to be learned is that if you do not think a function exists in .NET, keep looking; you just might find it.
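A quick sketch of the round trip between the numeric OLE Automation date format and the .NET DateTime class:

```vb
' FromOADate is a Shared function; ToOADate is an instance method.
Dim myDate As DateTime = DateTime.FromOADate(100)  ' 100 days past the OLE epoch (30 Dec 1899)
Dim raw As Double = myDate.ToOADate()              ' back to the old 8-byte numeric form
```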

Another type that changes is the VB6 Currency type, which converts to a VB.NET Decimal type. I have heard this can cause issues, but the few tests I have run show them behaving the same.

Attributes are new to .NET. In VB6, a procedure had three things: visibility (Public/Private), scope (local/global), and type (Integer, etc.). .NET adds an arbitrary number of optional attributes; these can be used by the compiler, by the program at runtime, or both. The ComVisible attribute is an example of one used by the compiler. When added to a class, property, or function, it tells the compiler to generate code that allows that member to be accessed by legacy COM applications. You can also create your own custom attributes (it is similar to creating a class) and check at runtime whether one has been applied to a function; why and how you would do this is outside the scope of this article. The .NET Framework provides a number of useful attributes, a few of which appear later in this article.
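As a brief illustration, marking a class visible to COM looks like this (the class and method names are invented for the example):

```vb
Imports System.Runtime.InteropServices

<ComVisible(True)> _
Public Class LegacyBridge
    ' Callable from VB6 or other COM clients once the assembly is registered.
    Public Function GetStatus() As Integer
        Return 1
    End Function
End Class
```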

DLLs can be a lot of things. Many are just libraries of simple functions; the Windows APIs are a good example of this type of DLL. Others can be much more complex; ActiveX controls, Excel add-ins, and OLE servers are some examples of more complex DLLs. ActiveX controls will be the only complex DLLs covered in this article.

Simple DLLs are easy to use in .NET. Just add one or more Declare statements to your program, and you can call the functions in the library as if they were subroutines in your own program. The upgrade wizard does a good job of converting these Declare statements from VB6 to VB.NET. In addition to upgrading the variable types discussed above, there are two new .NET issues, packing and pinning, that may need to be addressed for some of the parameters being passed.
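A simple Declare converts almost mechanically. Here is the real Windows API GetTickCount before and after; only the integer width notation changes:

```vb
' VB6:    Declare Function GetTickCount Lib "kernel32" () As Long
' VB.NET: a VB6 Long (32 bits) maps to a .NET Integer (32 bits).
Declare Function GetTickCount Lib "kernel32" () As Integer
```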

To understand packing, let's look at an example. Consider a VB6 user-defined type that contains two Integers; after the conversion, we want them to still be 16-bit integers. As mentioned above, the first thing we need to do is change the type from Integer to Short. Now let's look at packing.

Suppose we declare an instance of this type in VB6. The compiler will put it in memory somewhere; to simplify this discussion, assume the instance is placed at the beginning of memory, at address 0000. The two bytes of the first Integer will be placed at addresses 0000 and 0001, and the second Integer at addresses 0002 and 0003. This minimizes memory use, and carried no performance penalty on the 16-bit processors (8086/80286) that Windows and VB were originally designed to run on. Modern 32-bit processors can access the first Integer at full speed: they load the 32 bits starting at 0000 and simply mask out the second 16 bits. Accessing the second Integer is more work. When the processor loads the first 32 bits, the 16 bits it needs are in the wrong position, so there is a small penalty while the processor shuffles the bits into place. Loading variables is one of the most common things a processor does, so this small penalty becomes a big factor summed over all the variables a program uses. .NET prevents this slowdown by placing all variables on 32-bit boundaries (this has been a compile option in C++ for some time). Under .NET, the first Integer of our type will be at addresses 0000 and 0001 just as before, but the second Integer will be at 0004 and 0005; addresses 0002 and 0003 will not be used. This wastes memory, but the program runs faster.

The problem with this is that if you pass the user-defined type to a DLL, that DLL will almost certainly expect the type to be "packed," as in VB6. .NET handles this with the StructLayout attribute and its Pack setting. Adding the attribute causes the VB.NET compiler to pack the type as VB6 did, saving memory and maintaining compatibility at the expense of speed. The following example shows how this is done.

Imports System.Runtime.InteropServices

<StructLayout(LayoutKind.Sequential, Pack:=1)> _
Structure MyType
	Dim IntOne As Short   ' a 16-bit Integer in VB6
	Dim IntTwo As Short
End Structure

In cases where user-defined types, arrays, or fixed-length strings are passed as parameters to the DLL, the upgrade wizard will add warnings that marshaling attributes may need to be added. Even though the wizard places the warning at the Declare statement, the attribute (as in the above example) needs to be added where the parameter type itself is defined.
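For example, a VB6 type containing a fixed-length String * 20 needs a marshaling attribute on the field itself, not on the Declare. A sketch (the structure and field names are invented):

```vb
Imports System.Runtime.InteropServices

<StructLayout(LayoutKind.Sequential, CharSet:=CharSet.Ansi)> _
Structure Employee
    Public Id As Integer
    ' Was "Name As String * 20" in VB6; the DLL still expects 20 ANSI bytes.
    <MarshalAs(UnmanagedType.ByValTStr, SizeConst:=20)> _
    Public Name As String
End Structure
```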

Garbage collection brings up another issue with DLLs: pinning. In .NET, the memory manager/garbage collector can and will move variables around in memory while a program is running. For instance, if a program creates five 100-byte arrays, they will probably sit in a contiguous 500-byte block of memory. If the array occupying the first 100 bytes is released and garbage collected, the other 400 bytes are moved down 100 bytes so there are no unused "holes" in the memory block. If a program passes a reference to a block of memory to a DLL, and that memory is later moved, the DLL's reference is no longer valid, causing a major bug. To prevent this, a variable in C# can be "pinned" with the fixed statement, which tells the garbage collector not to move the referenced memory; VB.NET has no equivalent of fixed, so I am unsure how this is handled in VB.NET.
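For what it is worth, one mechanism that does exist in VB.NET, offered here as my own suggestion rather than anything from the original text, is the GCHandle class, which can pin any object for the duration of an unmanaged call:

```vb
Imports System.Runtime.InteropServices

Dim buffer(99) As Byte
' Pin the array so the garbage collector cannot move it while unmanaged code holds a pointer.
Dim handle As GCHandle = GCHandle.Alloc(buffer, GCHandleType.Pinned)
Try
    Dim ptr As IntPtr = handle.AddrOfPinnedObject()
    ' ... pass ptr to the DLL here ...
Finally
    handle.Free()   ' always unpin, or the block stays fixed in memory
End Try
```

Note that for ordinary Declare calls the .NET marshaler pins arguments automatically for the duration of the call; explicit pinning matters mainly when the DLL keeps the pointer after returning.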

DLLs, DAO, RDO, ADO, and ADO.NET; the History of VB DBs
In the early versions of VB, there were no database controls, and databases were accessed by vendor-specific DLLs, but VB's power and ease of creating forms still made it a favorite among database programmers. One thing that made it so easy to create forms in VB was VBXs, the forerunners of ActiveX controls. In VB3, Microsoft added DAO (Data Access Objects), allowing easy access to ODBC databases, and a good thing was started. RDO (Remote Data Objects) came next. Where DAO focused on connecting to small Access-type databases, RDO targeted a different market, large databases such as MS SQL and Oracle, so to a large extent RDO complemented rather than replaced DAO. ADO was to be the technology to combine the two.

Upgrading from the earlier DAO or RDO architectures will be difficult. The Data control that was typically used by DAO and RDO programmers is not supported by .NET, and the upgrade wizard just marks each place the data control is used with a comment that it is no longer supported. Following the comment to the documentation just leads to generic information about upgrading with no further help.

I have had some success creating an ActiveX control in VB6, adding the data control to the new ActiveX control, then adding functions and subroutines to make the data control accessible to its container in the VB client program. The accessors looked like

Public Sub SetDataBaseName(Name As String)
    Data.DatabaseName = Name
End Sub

Public Sub SetRecordSource(Source As String)
    Data.RecordSource = Source
End Sub

Public Sub Refresh()
    Data.Refresh
End Sub

Public Sub RecordSetAddNew()
    Data.Recordset.AddNew
End Sub

Public Sub RecordSetFields(Field As String, Name As String)
    Data.Recordset.Fields(Field) = Name
End Sub

In the client VB program,

Data1.Recordset.Fields("[Territory Num]") = RTrim(LTrim(txtTerritoryNum.Text))

was replaced with

Data1.RecordSetFields "[Territory Num]", RTrim(LTrim(txtTerritoryNum.Text))

and so on. This worked well for strings; other RecordSetFields routines were needed for other types of fields, or RecordSetFields could be declared with Variant as the value type, detecting the proper type with the VarType function. This works all right if you just want to use the Data control by itself, but the whole point of using the Data control is to bind it to controls such as the DBGrid. Making that work correctly would require making the custom control a true VB data source. That is beyond the scope of this article, and might not survive conversion to .NET. For anyone wishing to try (and it might work), search for "Creating a Data Source" in the MSDN help files for more information.
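A hypothetical sketch of the Variant-based accessor mentioned above (VB6 syntax; the name and type handling are invented for illustration):

```vb
Public Sub RecordSetFieldsAny(Field As String, Value As Variant)
    Select Case VarType(Value)
        Case vbString
            Data.Recordset.Fields(Field) = Trim$(Value)
        Case vbInteger, vbLong, vbDouble, vbCurrency
            Data.Recordset.Fields(Field) = Value
        Case Else
            Err.Raise 5, , "Unsupported field type"
    End Select
End Sub
```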

I used a similar trick to convert a VB6 form to .NET. The form was designed to allow adding new records to an existing database. It had three parts: a Data control for accessing the database, a DBGrid (from VB5) for displaying the records in the database, and a set of TextBoxes for entering the values for the new record. I took the custom data control created above and added the DBGrid to it, including some resizing logic so that the grid always covered the entire control area. Next, I set the grid's data source to the data control and built the ActiveX control. I then removed both the Data control and the DBGrid from the VB program I was converting and replaced them with the new custom control containing both. It upgraded to VB.NET and ran fine. Again, being creative with custom ActiveX controls can make otherwise impossible conversions fairly simple.

ADO.NET is one of the biggest changes yet in VB database connectivity. In ADO, programs were meant to stay connected to the database (ADO could be used in a disconnected mode, but most VB6 programs used the normal connected mode). In some ways this simplified database programming, because changes made to records and tables in the program were automatically written to the database. It also created a scalability problem: one of the things most database vendors charge for is the number of simultaneous connections allowed. Since ADO connections tend to exist for the entire time the VB program is running, a license is tied up for just as long. Sometimes this does not matter, but if the program has many users, or worse, is accessed over the Internet, tying up licenses for extended periods can get very expensive. A large number of open connections can also slow overall database performance.

With ADO.NET, connections are opened only to read, write, or update the database, and closed while the program or user works with the data. This means a little more work for the programmer, but produces a slightly faster and much more scalable program.
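A minimal sketch of this open-late, close-early pattern (the connection string and table name are placeholders):

```vb
Imports System.Data
Imports System.Data.OleDb

Dim conn As New OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=sales.mdb")
Dim adapter As New OleDbDataAdapter("SELECT * FROM Territories", conn)
Dim ds As New DataSet()
adapter.Fill(ds, "Territories")     ' opens the connection, reads, and closes it again
' ... the user works with ds while no connection is held ...
Dim builder As New OleDbCommandBuilder(adapter)
adapter.Update(ds, "Territories")   ' reopens briefly to push the changes back
```

Fill and Update manage the connection automatically when it is closed, so no license is tied up between operations.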

One half-step that avoids a complete rewrite is to switch to .NET but continue using ADO instead of ADO.NET. This is possible because ADO is implemented as a COM object in msado15.dll. Like any COM object, the ADO DLL needs a wrapper before .NET can access it. The .NET Framework includes a wrapper for this DLL called adodb.dll, but you can also create your own from any version of ADO using the tlbimp.exe utility that was briefly mentioned last month. Another option, as always, is that if a form can be separated from the rest of the application, it can be converted into an ActiveX control and added to a .NET form. A further advantage of these techniques is that ADO has a few features that are not available in ADO.NET, most notably server-side cursors and the ADO extensions (ADOX).

I have already mentioned that moving from the connected ADO paradigm to the disconnected ADO.NET paradigm gives a boost in performance, and a big boost in scalability. There are other reasons to switch to ADO.NET, too. ADO uses recordsets that resemble tables; ADO.NET uses datasets that resemble databases. To access more than one table in ADO, the query forming the recordset had to join across the tables; in ADO.NET, the dataset can contain all the tables, as well as DataRelation objects, which resemble foreign keys in a real database. This can make complex data manipulations much easier. The old client-side cursor functionality can still be found in the DataReader object. Another big difference is that while ADO can only load data into recordsets, ADO.NET can also load data into many "normal" programming structures such as arrays and lists. Where ADO tables were accessed by forward-only cursors, ADO.NET tables have indexers and can be accessed like any other collection. Also, most .NET controls, even text boxes, allow data binding not just to text but to other properties such as color (Note From the Author: this URL will only work if Visual Studio is installed on your computer: see ms-help://MS.VSCC.2003/MS.MSDNQTR.2003FEB.1033/dnadvnet/html/vbnet02122002.htm).
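A sketch of a two-table DataSet tied together with a DataRelation (table and column names are invented):

```vb
Imports System.Data

Dim ds As New DataSet("Sales")
Dim customers As DataTable = ds.Tables.Add("Customers")
customers.Columns.Add("CustId", GetType(Integer))
Dim orders As DataTable = ds.Tables.Add("Orders")
orders.Columns.Add("OrderId", GetType(Integer))
orders.Columns.Add("CustId", GetType(Integer))
' Acts much like a foreign key between the two in-memory tables:
ds.Relations.Add("CustOrders", customers.Columns("CustId"), orders.Columns("CustId"))
```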

Finally, ADO.NET is based on XML, which makes moving data between objects and programs easier and more efficient; using XML for this instead of ADO's COM calls also means that data can pass through firewalls without configuration changes.
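The XML plumbing is exposed directly on the DataSet; a sketch of the round trip (table, column, and file names are placeholders):

```vb
Imports System.Data

Dim ds As New DataSet("Sales")
ds.Tables.Add("Territories").Columns.Add("Name", GetType(String))
ds.Tables("Territories").Rows.Add("Northwest")
ds.WriteXml("sales.xml")      ' plain XML text, so it travels over HTTP through firewalls

Dim ds2 As New DataSet()
ds2.ReadXml("sales.xml")      ' another program reloads the same data
```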

Next Up
That completes our coverage of converting general and database code. Next month, in the final installment, we will cover ASP to ASP.NET conversion and converting to VB.NET 2005 and C#, and conclude with some thoughts on when it makes sense to convert.

More Stories By Dennis Hayes

Dennis Hayes is a programmer at Georgia Tech in Atlanta Georgia where he writes software for the Adult Cognition Lab in the Psychology Department. He has been involved with the Mono project for over six years, and has been writing the Monkey Business column for over five years.

