MySQL the .NET Way

The .NET Framework is an exciting and enabling technology, allowing developers using many different languages and platforms to share tools and components like never before. Open-source efforts to bring .NET to Linux and other platforms are starting to pay off, and I can't think of a better way to use an open-source version of .NET than to work with one of the best open-source databases, MySQL.

There are two major open-source .NET initiatives under way, Mono and Portable.NET. At the moment, the Portable.NET project does not have an ADO.NET stack available, so in this article I'll use Mono as my .NET vehicle of choice. There are several MySQL data providers available, including one built into the Mono package, but I'll focus on a provider from ByteFX, Inc. This provider is unique in that it is open source, written in 100% managed C# code, and Mono compatible.

The sample code included with this article has been tested using Mono for Windows and Linux. Note that unless otherwise specified, any mention of .NET refers equally to Microsoft .NET and Mono.

Getting Started
One of the best ways to understand a new component is to build something with it. With that in mind, we're going to build a simple app using the ByteFX data provider and Mono. To prepare, install a stable copy of Mono on your system. I used version 0.17 on Windows XP Professional (a nice Windows installer can be found at www.go-mono.com/download.html). Next we'll need a place to work, so create a folder anywhere on your system to hold the files for our app. Finally, download the latest binary release of the data provider from www.sourceforge.net/projects/mysqlnet. You'll need to get version 0.65 or later to be Mono compatible. Unzip the archive into the folder you created (it should include assemblies ByteFX.Data.dll and SharpZipLib.dll) and you're ready to start!

The file ByteFX.Data.dll is the data provider, while SharpZipLib.dll is a support assembly that provides data compression/decompression capability. SharpZipLib is licensed under the LGPL and available from www.icsharpcode.net.

The application we're going to build is a simple console application that takes connection information on the command line and then reads SQL statements from the console and executes them. We'll include the ability to execute the SQL inside a transaction. We'll also discuss using the provider to synchronize changes in a dataset with the MySQL server. I'll call this application MonoME (ME for minimalist example!).

It's Good to Have Connections
Before we can do anything with MySQL, we have to acquire a connection to the database and be authenticated. This is accomplished using the MySQLConnection object. Listing 1 shows the initial layout of MonoME, along with the GetConnection method, which creates a MySQLConnection object and stores it in the static member variable _conn. (The code for this article can be downloaded from www.sys-con.com/dotnet/sourcec.cfm.) There are a number of arguments that can be passed on the command line and used to construct the connection string. The connection string uses the same syntax as that used by the SQL Server provider and allows all the same aliases (server equals data source, password equals pwd, etc.). The driver also supports a nonstandard option called "use compression". This option instructs the driver and server to use compression for all communications, which can greatly speed up large queries over slow links.

There are two other things to notice here. One is this line near the top:

using ByteFX.Data.MySQLClient;

This line makes it clear that we will be using components of the MySQLClient assembly. The other thing to notice here is the use of try/catch to catch any MySQLClient exceptions that might occur while opening the connection. It should be standard practice in your client applications to wrap database operations in a try/catch block to catch any unforeseen errors. Assuming no exceptions were thrown, we should now be authenticated and ready to roll!
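As a sketch of that practice, opening a connection inside try/catch might look like this (the server, user id, password, and database values are placeholders, not taken from the article's listings):

```csharp
using System;
using ByteFX.Data.MySQLClient;

class ConnectDemo
{
    static void Main()
    {
        // Placeholder connection values; substitute your own server,
        // user id, password, and database.
        MySQLConnection conn = new MySQLConnection(
            "data source=localhost;user id=myuser;password=mypass;database=test");
        try
        {
            conn.Open();
            Console.WriteLine("Connected to MySQL!");
            conn.Close();
        }
        catch (MySQLException ex)
        {
            // Any driver or server error surfaces here instead of crashing the app.
            Console.WriteLine("Error connecting: " + ex.Message);
        }
    }
}
```

The catch block is the important part: authentication failures, unreachable hosts, and protocol errors all arrive as exceptions from Open.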

Compiling Listing 1 with Mono couldn't be easier. Simply open up a command line in the folder where you saved Listing 1 and the two assemblies from earlier, and type this:

mcs -r ByteFX.Data.dll Listing1.cs

mcs is the Mono C# compiler and the -r option tells it to include a reference to the ByteFX.Data assembly in the resulting application. Assuming no errors were found, you should now have a new file called Listing1.exe. You can execute this file with the following command (remember to include the parameters GetConnection is looking for):

mono Listing1.exe -u <username> -d <database> -h <server>

Taking Command
The next thing for MonoME to do is read SQL strings for execution. You didn't come here to read about reading strings off consoles, so I'll just refer you to Listing 2 and skip ahead to the fun part, creating and using MySQLCommand objects.

Each of the SQL strings read in needs to be executed against the database server. This is accomplished with the MySQLCommand object. Commands can be created either by calling _conn.CreateCommand() or by instantiating one directly. In Listing 2 we've created a method called ExecuteCommand. In this method, we create a MySQLCommand object using the existing connection. We'll use this method to execute the SQL that is entered at the console. This method will print out the resultset in a comma-delimited format.

Creating the MySQLCommand object requires a MySQLConnection object to use and the SQL to execute. These can be provided using properties or passed into the constructor. Once the command object is ready, the SQL can be executed and results obtained using one of three methods. ExecuteNonQuery should be used to execute any nonselect SQL such as delete, insert, or update commands. ExecuteScalar will execute a select command but will return only the first column as a single object. This can be useful to quickly retrieve the value of a builtin function such as MySQL's last_insert_id (). ExecuteReader will execute a select statement and return an object of type MySQLDataReader. This object will allow you to iterate over the resultset, retrieve field information and values, and even return resultset metadata.
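To make the three methods concrete, here is a hedged sketch; it assumes an open MySQLConnection named conn and a table named test, neither of which comes from the article's listings:

```csharp
// Assumes conn is an open MySQLConnection and a table "test" exists.
MySQLCommand cmd = new MySQLCommand("delete from test where id = 99", conn);
int affected = cmd.ExecuteNonQuery();         // nonselect SQL: returns rows affected

cmd.CommandText = "select count(*) from test";
object count = cmd.ExecuteScalar();           // single value: first column, first row

cmd.CommandText = "select * from test";
MySQLDataReader reader = cmd.ExecuteReader(); // full resultset
// ... iterate over reader here ...
reader.Close();
```

Reusing one command object and swapping CommandText, as shown here, is a common pattern when issuing several statements over the same connection.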

An interesting item to consider is the execution of multiple SQL statements in a single call to the MySQLCommand object. This is common using the MySQLDataAdapter class, which we'll discuss a bit later. An example query might be this:

update test set name='John' where id=1; select * from test;

The data provider supports this type of operation, but you should be aware of the behavior. Multiple nonselect commands can be executed in a single call to ExecuteNonQuery, and the total number of records affected for the entire batch is returned. It's currently impossible to retrieve affected record counts for the individual SQL statements. Multiple select statements can be executed with ExecuteReader, but only the last statement will actually be loaded into the reader for processing.
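A short sketch of the batch behavior, again assuming an open connection conn and a table test that are not part of the article's listings:

```csharp
// Two nonselect statements in one call: only the combined count comes back.
MySQLCommand batch = new MySQLCommand(
    "insert into test (name) values ('a'); insert into test (name) values ('b')",
    conn);
int total = batch.ExecuteNonQuery(); // total rows affected for the whole batch
```

If you need a per-statement count, execute each statement in its own call instead.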

It's All About the Data
MySQLDataReader is a forward-only, nonbuffering class that retrieves the results of a select SQL command against a MySQL data source. For MonoME, we want to display the field data in a reasonably nice format. To do that, we're going to need the number and names of the fields contained in the resultset. This short bit of code gives us the name of each of the columns.

for (int x = 0; x < reader.FieldCount; x++)
{
    string fieldname = reader.GetName(x);
}

Now that we know the column names, we want to retrieve and show the data. The procedure is very similar to the older ADO way of traversing recordsets. The following code shows how to cycle through the recordset, displaying column values.

while (reader.Read())
{
    for (int col = 0; col < reader.FieldCount; col++)
    {
        object o = reader.GetValue(col);
        Console.Write(o.ToString());
    }
}

The type of object returned by GetValue depends on the type of that column. You could read the value into a particular type using the same GetValue method, but then you'd need to cast like this:

UInt32 id = (UInt32)reader.GetValue(col);

To avoid casting every column, the data reader has several helper methods available that return a column's value in a particular type. Here are two examples:

String s = reader.GetString(col);
Int32 id = reader.GetInt32(col);

The actual bytes that make up the value of a column are also available using the GetBytes method. Note that GetBytes can return the data for any field type; it's the only way to get the data for BLOB (binary large object) and TEXT (which are also BLOB) fields.
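A hedged sketch of pulling BLOB data out with GetBytes; the reader and column index are assumed to be in scope, and the pass-a-null-buffer idiom for obtaining the field length is the standard ADO.NET IDataRecord convention:

```csharp
// Passing a null buffer asks GetBytes for the field's total length first.
long size = reader.GetBytes(col, 0, null, 0, 0);
byte[] buffer = new byte[size];
reader.GetBytes(col, 0, buffer, 0, (int)size);
// buffer now holds the raw bytes of the BLOB or TEXT field.
```

For very large fields you could instead call GetBytes repeatedly with a small buffer and an advancing data offset to stream the value in chunks.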

MySQLDataReader also implements the IEnumerable interface. This interface is critical to proper execution of the data reader in all environments. The foreach keyword in C# provides a good example of such an environment. In C#, foreach is used to iterate over a set of items. Here's a brief example:

foreach (char c in "Hello")
{
    // do something with c
}

Because MySQLDataReader implements the IEnumerable interface, we can write this:

foreach (IDataRecord rec in reader)
{
    // work with rec
}

In the example above, IDataRecord is an interface defined in the .NET Framework and represents a single record of a resultset.

When All Else Fails, Use an Adapter
MonoME is pretty simple, but in a real-world application, data access and manipulation are more complex. One of the tools .NET gives you to deal with this complexity is the data adapter. The ByteFX driver provides a compliant implementation in the MySQLDataAdapter class.

As you have already learned in previous issues of .NET Developer's Journal, the DataSet is a very powerful object used to contain data and changes made to that data in one or more tables. DataSet objects are filled with data by calling the data adapter's Fill method and synchronized with the data source through the Update method. Listing 3 is an example of a fully populated MySQLDataAdapter. It is almost identical to the code you would use for SqlDataAdapter (SQL Server).

The key thing to notice in Listing 3 is that MySQLCommand objects are constructed for each command type: select, insert, update, and delete. While this is not required, it is advisable because it enables the adapter to handle any type of change that occurs in the data tables covered by that adapter. When you have changes in your dataset, you would call the Update method.

// change the first column
ds.Tables["test"].Rows[0][0] = 2;
DataSet changes = ds.GetChanges();
da.Update(changes);
ds.Merge(changes);
ds.AcceptChanges();

GetChanges is a method of the DataSet object and returns a DataSet object containing all the changed rows (newly added, changed, or deleted). This is an optimization, as it prevents large amounts of data from being sent back to the data source when only a few rows may have changed. The Update method sends those changes to the data source, calling one of the command objects we created in each case. The Merge method on the next line appears unnecessary but is really very important. In many cases, the server may update information in the changed data (such as an autoincrement field), and we want to capture those changes. Finally, we accept the changes in our local dataset using AcceptChanges. This gets us back to an unchanged state.

I'm sure you've noticed that the code required to create our data adapter, while not bad in this example, could get quite lengthy. That's where MySQLCommandBuilder helps out. In a typical scenario in which you're accessing a single table, you can skip building command objects for update, insert, and delete and let MySQLCommandBuilder do the work for you. To use it, you need to supply your adapter with a select command and then register the builder. Here's how:

da.SelectCommand = new MySQLCommand("select * from test", conn);
MySQLCommandBuilder cb = new MySQLCommandBuilder(da);
da.Update(ds); // update changes

The command builder uses the select command to retrieve enough metadata from the data source to construct the remaining commands. The command builder needs the select command to include a primary key or unique column; in fact, if one isn't present, an InvalidOperationException will be thrown and the commands will not be generated. It's also important to note that MySQLCommandBuilder can be used only for single-table scenarios.

Real Programmers Use Parameters
In order for the MySQLCommand objects to be generic and not tied to a single record, we need to use parameters. The data provider includes the classes MySQLParameter and MySQLParameterCollection. The MySQLCommand class includes a property called Parameters that is used to assign parameters to the command. When assigning parameters, you'll need to give the corresponding column name from the table and the data type of the column; the data type comes from the MySQLDbType enumeration. Here is an example of adding a parameter to a command object (the size value is illustrative):

cmd.Parameters.Add(new MySQLParameter("@title", MySQLDbType.VarChar, 255, "title"));
In this example, we're adding a new parameter with the name "@title" mapped to the column "title" of type VarChar. The MySQL data provider also provides support for specifying which version of a parameter to use during update procedures. To accomplish this, you must use one of the DataRowVersion constants to specify which value of the parameter to use. For example:

cmd.Parameters.Add(new MySQLParameter("@id", MySQLDbType.Int32, 4,
    ParameterDirection.Input, false, 0, 0, "id",
    DataRowVersion.Original, null));

This would be a very common parameter for an update command since you want to update field values for an existing row specified by a given ID. Here we are defining a parameter to be equal to the original value of the "id" column. See Listing 3 for an example of a command that uses this parameter.

Improvements to MonoME
There are several ways MonoME could be improved; we'll cover two as we finish up. The first improvement is really fixing a problem. The problem is that not all commands given to MonoME will be select commands. While not a complete solution, simply checking for the word "select" at the start of the command and choosing the correct method of MySQLCommand should help.
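One possible sketch of that check; the variable names here are illustrative rather than taken from the article's listings:

```csharp
// Assumes cmd is a MySQLCommand whose CommandText is the SQL string sql
// entered at the console.
string trimmed = sql.TrimStart();
if (trimmed.ToLower().StartsWith("select"))
{
    MySQLDataReader reader = cmd.ExecuteReader();
    // ... display the resultset as before ...
    reader.Close();
}
else
{
    int affected = cmd.ExecuteNonQuery();
    Console.WriteLine(affected + " rows affected");
}
```

This simple prefix test misses cases such as show or describe statements, which is why it's only a partial solution.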

The other improvement we'll cover is the inclusion of transactions. MySQL doesn't support transactions on all table types, but more and more servers are using the InnoDB table types, so transactions are possible. The ByteFX MySQL data provider supports transactions using the MySQLTransaction class. The usage is identical to the SqlTransaction (SQL Server) class. A transaction object is created by calling _Connection.BeginTransaction. Here's how that would look:

MySQLTransaction myTrans;
// Start a local transaction
myTrans = myConnection.BeginTransaction();
// Must assign both transaction object and connection
// to Command object for a pending local transaction
myCommand.Connection = myConnection;
myCommand.Transaction = myTrans;

Note that both the connection and the transaction must be assigned to the command object before the command is executed. Once the command is executed, either a commit or rollback should be performed. Take a look:

try
{
    // ... command text has already been assigned ...
    myCommand.ExecuteNonQuery();
    myTrans.Commit();
}
catch (MySQLException e)
{
    myTrans.Rollback();
    Console.WriteLine(e.Message);
}

Here we catch any exceptions caused by the command and perform a rollback in that case.

In the context of MonoME, multiple commands could be given at once, separated by a ";" character. Executing those commands inside a transaction would guarantee data integrity on the server.
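Putting those pieces together, a hedged sketch of running a ";"-separated batch inside a transaction might look like this (the SQL text and variable names are illustrative, and conn is assumed to be an open MySQLConnection):

```csharp
MySQLTransaction trans = conn.BeginTransaction();
MySQLCommand cmd = new MySQLCommand(
    "update test set name='a' where id=1; delete from test where id=2", conn);
cmd.Transaction = trans;
try
{
    cmd.ExecuteNonQuery();
    trans.Commit();   // both statements succeed together...
}
catch (MySQLException)
{
    trans.Rollback(); // ...or neither takes effect
}
```

Remember that this guarantee only holds when the tables involved use a transactional type such as InnoDB.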

Wrap Up
In this article, we've taken a brief look at using Mono and the ByteFX MySQL data provider to access your MySQL data. There's a lot we didn't cover and I would encourage you to investigate Mono and the different data providers that are available. To aid you in your discovery, I have intentionally left out the final code listing of MonoME. You have all the parts from the listings; they just need to be assembled and compiled. .NET is no longer coming to non-MS platforms. It's here and it is exciting!

More Stories By Reggie Burnett

Reggie Burnett is a .NET architect for MySQL, Inc. Reggie wrote the original ByteFX ADO.NET provider for MySQL and currently maintains that code under its current name of MySQL Connector/Net.
