
Converting VB6 to VB.NET, Part II

Moving along

Last month (Vol. 2, issue 9), I gave an executive overview of the conversion process, and started looking at converting general VB6 code to VB.NET. This month I will finish general conversions, including DLLs, then start on database conversions. Next month, in the final segment, I will cover converting ASP.NET Web pages, and look at converting to VB.NET 2005 and C#.

Nothing Is Perfect
After the conversion wizard is done, the upgrade report will probably contain a long list of issues. Many of these are minor things that nothing can be done about, but that in most cases will have little impact on the final program. For instance, if you use the ZOrder property in VB6, the wizard will convert it, but add a message that the behavior has changed. Every issue flagged by the wizard includes a link to a URL explaining the problem. In the case of ZOrder, the difference is that in VB6 the form is brought to the front of the application; in VB.NET the application is also brought to the front of all other applications running on the computer. In most cases this is closer to the desired result than the VB6 behavior.

The Dir command is another example. In VB6, using the Dir command to list the files in a directory always listed "." and ".." first in the returned file list. In VB.NET, "." and ".." are never included in the Dir results. These entries were not always present in VB6 either - for instance, they are not part of the root "\" directory - so correct VB6 code checks for them explicitly. Such code will perform correctly in VB.NET. However, some VB6 programs simply skip the first two files returned by Dir, assuming they are "." and ".."; that code will always miss the first two files in VB.NET, and in some cases in VB6 as well. This is another example of well-written code converting well.
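A defensive way to walk a directory, which behaves the same before and after conversion (a sketch; the path is hypothetical), is to skip the special entries by name rather than by position:

```vb
' Works in VB6 and in upgraded VB.NET code: skip "." and ".."
' by name, never by assuming their position in the list.
Dim FileName As String
FileName = Dir("C:\SomeFolder\*.*")
Do While FileName <> ""
    If FileName <> "." And FileName <> ".." Then
        ' process FileName here
    End If
    FileName = Dir()
Loop
```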

Variables and Attributes
Most VB6 variable types have equivalent types in VB.NET, but there are a couple of twists. An Integer in VB6 is a 16-bit integer; in VB.NET, it is a 32-bit integer. The 16-bit equivalent in .NET is a Short, which is what the wizard will convert it to. Sometimes this is correct, but in cases where the 16-bit value reflects a VB6 limitation that does not exist in VB.NET, it makes more sense to upgrade the variable to the VB.NET 32-bit Integer. The preferred way to do this is to change the VB6 Integer to a VB6 Long (32-bit integer) before the conversion. A complete list of equivalent data types not only for VB6 and VB.NET, but also IDL (COM), the .NET Framework, and MSIL (.NET assembly language) can be found at (Note From the Author: This URL will only work if Visual Studio is installed on your computer) ms-help://MS.VSCC.2003/MS.MSDNQTR.2003FEB.1033/dv_vstechart/html/vbtchTroubleshootingNETInteroperability.htm.

The VB6 Date type is a simple 8-byte variable type; VB.NET uses the .NET DateTime type for the same functionality. In places where the old 8-byte VB6 representation is needed, use the DateTime functions FromOADate(x) and ToOADate(). The wizard will make use of these functions when converting; for example, MyDate = 100 will be upgraded to MyDate = DateTime.FromOADate(100). I have read that the VB6 Date type is one thing that just will not convert to .NET; these functions have done the few things I have required of them, so I suspect that those who say Date types are hard to upgrade are just not aware of these functions. There is so much in .NET that such mistakes are easy to make. When I first converted the MSChart control, I thought the Axis property had not converted (see Part 1 of this series), and spent a lot of time looking for a replacement control. This was not a total waste of time, as I switched to a pure .NET control, which I would have done eventually, but it did delay having working code for the first conversion pass. The lesson to be learned is that if you do not think a function exists in .NET, keep looking; you just might find it.
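As a quick sketch of the round trip (the value 100 is just the example from the text):

```vb
' OLE Automation dates count days from December 30, 1899,
' which is how VB6 stored a Date internally.
Dim MyDate As DateTime = DateTime.FromOADate(100)
Dim AsDouble As Double = MyDate.ToOADate()   ' back to the VB6-style value
```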

Another type that changes is the VB6 Currency type, which converts to a VB.NET Decimal type. I have heard this can cause issues, but the few tests I have run show them behaving the same.

Attributes are new to .NET. In VB6, a procedure had three things: visibility (public/private), scope (local/global), and type (Integer, etc.). .NET adds an arbitrary number of optional attributes; these can be used by the compiler, by the program at runtime, or both. The ComVisible attribute is an example of one used by the compiler. When added to a class, property, or function, it tells the compiler to create code that allows that member to be accessed by legacy COM applications. You can create your own custom attributes (it is similar to creating a class) and check at runtime to see whether one has been added to a function. Why and how you would do this is outside the scope of this article. The .NET Framework provides a number of useful attributes, a few of which will be used later in this article.
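As a sketch of the attribute syntax (ComVisible is a real framework attribute, but the class and its member here are hypothetical):

```vb
Imports System.Runtime.InteropServices

' The attribute in angle brackets marks this class as callable
' from legacy COM clients once the assembly is registered.
<ComVisible(True)> _
Public Class LegacyBridge
    Public Function Version() As String
        Return "1.0"
    End Function
End Class
```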

DLLs can be a lot of things. Many are just libraries of simple functions; the Windows APIs are a good example of this type of DLL. Others can be much more complex; ActiveX controls, Excel add-ins, and OLE servers are some examples of more complex DLLs. ActiveX controls will be the only complex DLLs covered in this article.

Simple DLLs are easy to use in .NET. Just add one or more Declare statements to your program, and you can call the functions in the library just as if they were subroutines in your own program. The upgrade wizard does a good job of converting these Declare statements from VB6 to VB.NET. In addition to upgrading the variable types discussed above, there are two new .NET issues, packing and pinning, that may need to be addressed for some of the parameters being passed.
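As a sketch of such a conversion, using the standard Windows API function GetTickCount (not an example from the original text):

```vb
' VB6:
'   Declare Function GetTickCount Lib "kernel32" () As Long
'
' VB.NET - Long is now 64 bits, so the 32-bit return value
' becomes an Integer:
Declare Function GetTickCount Lib "kernel32" () As Integer
```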

To understand packing, let's look at an example. Consider a VB6 user-defined type that contains two Integers; after the conversion, we want them to still be 16-bit integers. As mentioned above, the first thing we need to do is change the type from Integer to Short. Now let's look at packing.

Suppose we declare an instance of this type in VB6. The compiler will put it in memory somewhere; to simplify this discussion, we assume the instance is placed at the beginning of memory, at address 0000. The two bytes of the first Integer will be placed at addresses 0000 and 0001. The second Integer will be at addresses 0002 and 0003. This minimizes use of memory, and has no performance penalty on the 16-bit processors (8086/80286) that Windows and VB were originally designed to run on. Modern 32-bit processors can access the first integer at full speed; they load the 32 bits starting at 0000, and just mask out the second 16 bits. Accessing the second Integer is more complex. When the processor loads the first 32 bits, the 16 bits it needs are in the wrong position, so there is a small performance penalty when the processor shuffles the bits to where they need to be. Obviously, loading variables is a very common thing for a processor to do, so this small performance penalty becomes a big factor when you consider all the variables used in a program. .NET prevents this slowdown by putting all variables on 32-bit boundaries (this has been a compile option in C++ for a while). Under .NET, in this instance of the type, the first Integer will be at addresses 0000 and 0001, just like before, but the second integer will be at 0004 and 0005; addresses 0002 and 0003 will not be used. This wastes memory but the program will run faster.

The problem with this is that if you pass the user-defined type to a DLL, that DLL will almost certainly expect the type to be "packed," as in VB6. .NET handles this with the StructLayout attribute and its Pack field. Adding this attribute causes the VB.NET compiler to lay the type out as VB6 did, saving memory and maintaining compatibility at the expense of speed. The following example shows how this is done.

' Requires: Imports System.Runtime.InteropServices
<StructLayout(LayoutKind.Sequential, Pack:=1)> _
Structure MyType
    Dim IntOne As Short
    Dim IntTwo As Short
End Structure

In cases where user-defined types, arrays, or fixed-length strings are passed as parameters to the DLL, the upgrade wizard will add warnings that marshaling attributes may need to be added. Even though the wizard places the warning at the Declare statement, the attribute (as in the above example) needs to be added where the parameter type itself is defined.

Garbage collection brings up another issue with DLLs - pinning. In .NET, the memory manager/garbage collector can and will move variables around in memory while a program is running. For instance, if a program creates five 100-byte arrays, they will probably sit in a contiguous 500-byte block of memory. If the array occupying the first 100 bytes is released and garbage collected, the other 400 bytes will be moved down 100 bytes so there are no unused "holes" in the memory block. If a program passes a reference to a block of memory to a DLL, and that memory is later moved, the DLL's reference will no longer be valid, causing a major bug. To prevent this, a variable in C# can be "pinned" with the fixed statement, which tells the garbage collector not to move the referenced memory; VB.NET has no equivalent of "fixed", so I am unsure of how this is handled in VB.NET.
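One mechanism I am aware of for this in VB.NET (a sketch, not from the original column) is the framework's GCHandle class, which can pin a managed object for the duration of an unmanaged call:

```vb
Imports System.Runtime.InteropServices

' Pin a byte array so its address cannot change while an
' unmanaged DLL is holding a pointer to it.
Dim Buffer(99) As Byte
Dim Handle As GCHandle = GCHandle.Alloc(Buffer, GCHandleType.Pinned)
Try
    Dim Address As IntPtr = Handle.AddrOfPinnedObject()
    ' ... pass Address to the DLL here ...
Finally
    Handle.Free()   ' always unpin, or the memory can never move again
End Try
```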

DLLs, DAO, RDO, ADO, and ADO.NET: the History of VB DBs
In the early versions of VB, there were no database controls, and databases were accessed by vendor-specific DLLs, but VB's power and ease of creating forms still made it a favorite among database programmers. One thing that made it so easy to create forms in VB was VBXs, the forerunners of ActiveX controls. In VB3, Microsoft added DAO (Data Access Objects), allowing easy access to ODBC databases, and a good thing was started. RDO (Remote Data Objects) came next. Where DAO focused on connecting to small Access-type databases, RDO targeted a different market, large databases such as MS SQL and Oracle, so to a large extent RDO complemented rather than replaced DAO. ADO was to be the technology to combine the two.

Upgrading from the earlier DAO or RDO architectures will be difficult. The Data control that was typically used by DAO and RDO programmers is not supported by .NET, and the upgrade wizard just marks each place the data control is used with a comment that it is no longer supported. Following the comment to the documentation just leads to generic information about upgrading with no further help.

I have had some success creating an ActiveX control in VB6, adding the data control to the new ActiveX control, then adding functions and subroutines to make the data control accessible to its container in the VB client program. The accessors looked like

Public Sub SetDataBaseName(Name As String)
    Data.DatabaseName = Name
End Sub

Public Sub SetRecordSource(Source As String)
    Data.RecordSource = Source
End Sub

Public Sub Refresh()
    Data.Refresh
End Sub

Public Sub RecordSetAddNew()
    Data.Recordset.AddNew
End Sub

Public Sub RecordSetFields(Field As String, Name As String)
    Data.Recordset.Fields(Field) = Name
End Sub

In the client VB program,

Data1.Recordset.Fields("[Territory Num]") = RTrim(LTrim(txtTerritoryNum.Text))

was replaced with

Data1.RecordSetFields "[Territory Num]", RTrim(LTrim(txtTerritoryNum.Text))

and so on. This worked well for strings; other RecordSetFields routines were needed for other types of fields, or RecordSetFields could be declared with Variant as the value type, detecting the proper type at runtime with the VarType function. This works all right if you just want to use the Data control by itself, but the whole point of using the Data control is to bind it to controls such as the DBGrid. To make that work correctly would require making the custom control a true VB data source. This is beyond the scope of this article, and might not work once converted to .NET. For anyone wishing to try this (and it might work), search for "Creating a Data Source" in the MSDN help files for more information.

I used a similar trick to convert a VB6 form to .NET. The form was designed to allow adding new records to an existing database. It could be considered as having three parts: a Data control for accessing the database, a DBGrid (from VB5) for displaying the records in the database, and a set of TextBoxes for entering the values for the new record. I took the custom data control I created above and added a DBGrid to it. I included some resizing logic so that the grid always covered the entire control area. Next, I set the data source for the grid to the data control and built the ActiveX control. I removed both the Data control and the DBGrid from the VB program I was converting, and replaced them with my new custom control containing both. It upgraded to VB.NET and ran fine. Again, being creative with custom ActiveX controls can make otherwise impossible conversions fairly simple.

ADO.NET is one of the biggest changes yet in VB database connectivity. In ADO, programs were meant to be connected to the database at all times (ADO could be used in a disconnected mode, but most VB6 programs used the normal connected mode). This simplified database programming a bit, because changes made to records and tables in the program were automatically updated in the database. It also caused a scalability issue, because one of the things most database vendors charge for is the number of allowed simultaneous connections. Since ADO connections tend to exist for the entire time the VB program is running, a license is tied up for that long as well. Sometimes this does not matter, but if the program has a lot of users or, even worse, is accessed over the Internet, tying up a license for an extended time can get very expensive. Having a large number of open connections to the database can slow overall performance as well.

With ADO.NET, connections are opened just long enough to read, write, or update the database, and closed while the program or user works with the data. This means a little more work for the programmer, but produces a slightly faster and much more scalable program.
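A minimal sketch of the disconnected pattern (the connection string, table, and column names here are hypothetical):

```vb
Imports System.Data
Imports System.Data.OleDb

Dim ConnStr As String = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=sales.mdb"
Dim Adapter As New OleDbDataAdapter("SELECT * FROM Territories", ConnStr)
Dim Data As New DataSet()

' Fill opens the connection, reads the rows, and closes it again.
Adapter.Fill(Data, "Territories")

' Work with the data while disconnected.
Dim Row As DataRow = Data.Tables("Territories").NewRow()
Row("Name") = "Northwest"
Data.Tables("Territories").Rows.Add(Row)

' Reconnect just long enough to push the changes back.
Dim Builder As New OleDbCommandBuilder(Adapter)
Adapter.Update(Data, "Territories")
```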

One half-step that avoids a complete rewrite is to switch to .NET but continue to use ADO instead of ADO.NET. This is possible because ADO is implemented as a COM object in msado15.dll. Like all COM objects, the ADO DLL needs a wrapper before .NET can use it. The .NET Framework includes a wrapper for this DLL called adodb.dll, but you can also create your own from any version of ADO by using the tlbimp.exe utility that was briefly mentioned last month. Another option, as always, is that if a form can be separated from the rest of the application, it can be converted into an ActiveX control and added to a .NET form. A further advantage of these techniques is that ADO has a few features that are not available in ADO.NET, most notably server-side cursors and the ADO extensions (ADOX).

I have already mentioned that moving from the connected ADO paradigm to the disconnected ADO.NET paradigm gives a boost in performance and a big boost in scalability. There are other reasons to switch to ADO.NET, too. ADO uses recordsets that resemble tables; ADO.NET uses datasets that resemble databases. To access more than one table in ADO, the query forming the recordset had to do a join across the tables; in ADO.NET, the dataset can contain all the tables, as well as DataRelation objects, which resemble foreign keys in a real database. This can make complex data manipulations much easier. The old forward-only cursor functionality can still be found in the DataReader object. Another big difference is that while ADO can only load data into recordsets, ADO.NET can also load data into many "normal" programming structures such as arrays and lists. Where ADO tables were accessed by cursors, ADO.NET tables have indexers, and can be accessed like any other collection. Also, most of the .NET controls, even text boxes, allow data binding not just to text, but to other properties such as color (Note From the Author: This URL will only work if Visual Studio is installed on your computer) (see ms-help://MS.VSCC.2003/MS.MSDNQTR.2003FEB.1033/dnadvnet/html/vbnet02122002.htm).
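A sketch of relating two tables in one dataset (the table and column names are hypothetical):

```vb
Imports System.Data

' Assume a DataSet named Data has already been filled with
' "Customers" and "Orders" tables sharing a CustomerID column.
Dim Relation As New DataRelation("CustomerOrders", _
    Data.Tables("Customers").Columns("CustomerID"), _
    Data.Tables("Orders").Columns("CustomerID"))
Data.Relations.Add(Relation)

' Navigate from a parent row to its child rows without a join.
For Each Customer As DataRow In Data.Tables("Customers").Rows
    For Each Order As DataRow In Customer.GetChildRows(Relation)
        ' process Order here
    Next
Next
```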

Finally, ADO.NET is based on XML, which makes moving data between objects and programs easier and more efficient; using XML for this instead of ADO's COM calls also means that data can pass through firewalls without configuration changes.

Next Up
That completes our coverage of converting general and database code. Next month, in the final installment, we will cover ASP to ASP.NET conversion, and converting to VB.NET 2005 and C#; and conclude with some thoughts on when it makes sense to convert.

More Stories By Dennis Hayes

Dennis Hayes is a programmer at Georgia Tech in Atlanta Georgia where he writes software for the Adult Cognition Lab in the Psychology Department. He has been involved with the Mono project for over six years, and has been writing the Monkey Business column for over five years.

