
31 Days of Servers in the Cloud – Move a local VM to the Cloud (Part 5 of 31)

VMs up, up, and away! My turn!

In today’s installment of our “31 Days of Servers in the Cloud”, we wanted to show you how easy it is to load a locally created, Hyper-V-based virtual machine into Windows Azure.

“But it’s not really that easy, is it?  I’ve had a heckuva time trying to make this work!”

Actually, once the preliminaries are in place, it is easy.  But uploading anything from your local machine into a Windows Azure storage account requires connecting to your Azure account.. which means having a management certificate in place to authenticate the connection.. which is a process that is hard to discover.  Searching for a quick solution was confusing, because the tools are always changing.. and what was required several months ago isn’t necessarily the easiest way to do this today.

This leads me to a little disclaimer, which really could apply to every single article written for this series:

The documentation provided here is based on the tools as they exist during the Windows Azure Virtual Machine PREVIEW period.  Capabilities and operations are subject to change without notice prior to the release and general availability of these new features.

That said, I’m going to try to make this process as simple as possible, and leave you not only with the ability to launch a VM from your own uploaded .VHD (virtual hard disk) file, but also leave you in good shape for using some pretty useful tools (such as Windows PowerShell) for managing your Windows Azure-based resources. 

The rest of this article assumes that you already have a Windows Azure subscription.  If you don’t have one, you can start a FREE 90-DAY TRIAL HERE.

Create a local VM using Hyper-V

I’m going to assume that you know how to use Hyper-V to create a virtual machine.  You can do this in Hyper-V running on Windows Server 2008 R2 or Windows Server 2012.  You could even use Hyper-V installed on Windows 8.  The end result should be that you have a virtual machine installed as you want it, sysprepped (important!), and ready to go.  It’s that machine’s .VHD (the virtual hard disk) file that you’re going to be uploading into Windows Azure storage.
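If it’s been a while since you’ve run Sysprep, the typical generalize-and-shut-down invocation from inside the guest looks something like this (assuming a default Windows installation path):

# Run inside the VM as the last step before capturing the .VHD
C:\Windows\System32\Sysprep\sysprep.exe /generalize /oobe /shutdown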

If you want further help building and preparing a virtual machine, check out the first part of this article on how to build a VM: Creating and Uploading a Virtual Hard Disk that Contains the Windows Server Operating System

NOTE: If you’re going to use one of the storage exploring tools I will be mentioning later, you will want to create your disk as (or convert your disk to) a fixed-format VHD.  This is because those tools won’t convert the disk file on the fly, and the disk in Windows Azure storage is required to be a fixed disk (as opposed to a dynamic disk, which is the default). 
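If your disk is currently a dynamically expanding VHD, the Hyper-V PowerShell module on Windows Server 2012 (or Windows 8 with the Hyper-V tools enabled) can convert it for you.  A minimal sketch, with the VM shut down and hypothetical file paths:

# Convert a dynamically expanding VHD into a fixed-size copy (paths are examples only)
Convert-VHD -Path "D:\VMs\SmallTestServer-dynamic.vhd" -DestinationPath "D:\SmallTestServer.vhd" -VHDType Fixed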

Set Up Windows Azure Management

Before we can connect to our Windows Azure storage and start uploading, we need to have a management certificate in place, as well as the tools for doing the upload installed.

Although there are manual ways of creating and uploading a self-signed certificate, the easiest method is to use the Windows Azure PowerShell cmdlets.  Here is the download location for those:

Windows Azure PowerShell: 

Note that although the page says that it’s the November 2012 release, it actually gives you the December 2012 release.  That’s important, because the extremely beneficial Add-AzureVHD PowerShell cmdlet was only introduced in December.

Once those are installed, you can follow the instructions here:

Get Started with Windows Azure Cmdlets:

Specifically, THIS SECTION describes how to use Get-AzurePublishSettingsFile, which generates a certificate in Windows Azure and creates a local “.publishsettings” file that you then import using the Import-AzurePublishSettingsFile cmdlet.  Once that’s done, you’ll have the management certificate in place locally as well as in your Azure account.  And the best part is, this relationship is persistent!  From this point on, the Windows Azure PowerShell window will open already associated with your account.
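As a quick sketch of that flow (the .publishsettings path and file name below are just placeholders for whatever you actually download):

# Opens a browser so you can sign in; Azure generates a management certificate
# and offers a .publishsettings file for download
Get-AzurePublishSettingsFile

# Import the downloaded file to install the certificate and subscription info locally
Import-AzurePublishSettingsFile "C:\Downloads\MySubscription.publishsettings"

# Verify that your subscription is now associated with this PowerShell environment
Get-AzureSubscription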

For a really great write-up on setting up and using PowerShell for Windows Azure, check out Michael Washam’s excellent article HERE.

Create an Azure Storage Account

If you have already created a virtual machine in Windows Azure, then you already have a storage account and container that you can use to hold your disks.  But if you haven’t already done this, you will want to go into your portal and create one.

At the bottom of the portal, click “+ New”, and then choose Data Services –> Storage –> Quick Create


You’ll give your storage account a unique name, choose a geographical location, and then create it.

Once it’s created, select the new storage account and create a new “Blob Container” by selecting the CONTAINERS tab, and then clicking “CREATE A BLOB CONTAINER”.




Note the URL.  Copy it to the clipboard or otherwise keep it handy.  This URL will be used when we upload our VHD.
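Incidentally, if you’d rather stay in PowerShell, newer builds of the Windows Azure PowerShell module include storage cmdlets that can do roughly the same thing.  A sketch only, assuming your build has them and using made-up names:

# Create a storage account, grab its key, and add a blob container (all names are examples)
New-AzureStorageAccount -StorageAccountName "mystorageaccount" -Location "East US"
$key = (Get-AzureStorageKey -StorageAccountName "mystorageaccount").Primary
$ctx = New-AzureStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey $key
New-AzureStorageContainer -Name "vhds" -Permission Off -Context $ctx

# The container URL you'll need later follows this pattern:
# https://mystorageaccount.blob.core.windows.net/vhds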

Upload the Hard Disk into a Windows Azure Storage Container

“Kevin..  you also mentioned that we’ll need some tool to do the actual uploads.”

That’s right.  Until recently, the only tool provided by Microsoft for doing this was the “csupload” tool, a command-line utility that is installed with the Windows Azure SDK.  (Windows Azure Tools: – But don’t install it just yet… it installs much more than you need to complete this exercise.)

Once the SDK is installed, and you have the SubscriptionID and the Certificate Thumbprint for your connection, you open the Windows Azure Command Prompt and use the csupload command in two steps: first to set up the connection, then to do the upload.  The article Creating and Uploading a Virtual Hard Disk that Contains the Windows Server Operating System describes how to use the csupload tool.

All that said… DON’T DO IT!  Unless you’re a developer, the Windows Azure SDK is much more than you need!

“So what’s the alternative, Kevin?”

PowerShell!  Yes.. you already have the Windows Azure PowerShell module installed, so now you’re going to use two PowerShell cmdlets: Add-AzureVhd and Add-AzureDisk.

Add-AzureVhd does the upload.  This is the one that takes a LONG TIME to run (depending on the size of your .VHD and your upstream connection speed).  The result is a new page blob object up in your storage account.

Add-AzureDisk essentially tells Windows Azure to treat that new blob as a .VHD file that has a bootable operating system in it.  Once that’s done, you can go into the Windows Azure Portal, create a new machine, and see your disk as one of the machine disks available.

So in my example, with a fresh, sysprepped, fixed-disk (10GB) .VHD installation of Windows Server 2012, I run these two commands:

Add-AzureVhd -Destination -LocalFilePath d:\SmallTestServer.vhd

Add-AzureDisk -DiskName SmallTestServer -MediaLocation -OS Windows

(Of course, the first one takes quite a while for me.  About 13 hours.  Ugh.)
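For reference, here’s roughly what those two commands look like with the destination URLs filled in.  The storage account and container names below are hypothetical stand-ins for the container URL you noted earlier:

# Upload the local fixed-size VHD into blob storage as a page blob (URL is an example)
Add-AzureVhd -Destination "https://mystorageaccount.blob.core.windows.net/vhds/SmallTestServer.vhd" -LocalFilePath "D:\SmallTestServer.vhd"

# Tell Windows Azure the uploaded blob is a bootable OS disk
Add-AzureDisk -DiskName "SmallTestServer" -MediaLocation "https://mystorageaccount.blob.core.windows.net/vhds/SmallTestServer.vhd" -OS Windows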

“Hey Kevin.. what if I want to use and re-use that image as the basis for multiple machines?”

Excellent question!  And the good news is that, instead of using Add-AzureDisk, you use the Add-AzureVMImage cmdlet to tell Windows Azure that the disk should be made available as a re-usable image.  Like this:

Add-AzureVMImage -ImageName Server2012Eval -MediaLocation -OS Windows

Once that’s done, instead of just having a disk to use once for a new machine, I have a starting-point for one or more machines.
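Again for reference, here’s the image registration with a hypothetical media location filled in:

# Register the uploaded VHD as a re-usable image rather than a one-off disk
Add-AzureVMImage -ImageName "Server2012Eval" -MediaLocation "https://mystorageaccount.blob.core.windows.net/vhds/SmallTestServer.vhd" -OS Windows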

Create the Machine

In the portal it’s really no more complex than creating a new machine from the gallery:


Your disk should show up towards the bottom of the list.  Select it, and build your machine.

Once created, you should be able to start it as if it were any other machine built from a previously installed disk.

If you chose to add your disk as an image in the repository, then you also could create it using QUICK CREATE, because it is an image that is now available for you to use and re-use.
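And if you’d rather skip the portal entirely, recent builds of the Windows Azure PowerShell module can create a VM from that image in one line.  This is a sketch only, with hypothetical names, and the exact parameters have shifted between preview releases:

# Create a new cloud service and VM from the registered image (all names are examples)
New-AzureQuickVM -Windows -ServiceName "mytestsvc" -Name "SmallTestVM" -ImageName "Server2012Eval" -AdminUsername "itadmin" -Password "SomeStr0ngPassw0rd!" -Location "East US" -InstanceSize Small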


Other Errata

As long as we’re discussing working with Windows Azure Storage, here are a couple of tools that make it easier to manage, navigate, and upload/download items in your storage cloud:

Both have free trials, and aren’t really all that expensive.  I’ve had mixed results, and you have to be careful that you’re creating “page blobs” and not “block blobs”.  And with a slow upload connection, these tools are rather fragile.  The benefit: both of them let you configure a connection to your Windows Azure subscription and multiple storage accounts in order to upload and download your .VHD files.  For our purposes, they will do what the Add-AzureVhd cmdlet did for us, plus let you create and manage storage containers.  You’ll still need to run the Add-AzureDisk or Add-AzureVMImage commands to configure your disks for use.

(Major kudos to Joerg of ClumsyLeaf Software (makers of CloudXplorer), who answered my support questions in a matter of minutes!  And on a Saturday, no less!)


What do you think?  Are you going to try this out?  At the very least I hope that this article helps you get PowerShell configured for working with your Windows Azure objects.  Give us your questions or feedback in the comments.

Read the original blog entry...

More Stories By Kevin Remde

Kevin is an engaging and highly sought-after speaker and webcaster who has landed several times on Microsoft's top 10 webcast list, and has delivered many top-scoring TechNet events and webcasts. In his past outside of Microsoft, Kevin has held positions such as software engineer, information systems professional, and information systems manager. He loves sharing helpful new solutions and technologies with his IT professional peers.

A prolific blogger, Kevin shares his thoughts, ideas and tips on his “Full of I.T.” blog. He also contributes to and moderates the TechNet Forum IT Manager discussion, and presents live TechNet Events throughout the central U.S. When he's not busy learning or blogging about new technologies, Kevin enjoys digital photography and videography, and sings in a band. (Q: Midlife crisis? A: More cowbell!) He continues to challenge his TechNet Event audiences to sing Karaoke with him.
