
31 Days of Servers in the Cloud – Move a local VM to the Cloud (Part 5 of 31)

VMs up, up, and away! My turn!

In today's installment of our “31 Days of Servers in the Cloud” series, we want to show you how easy it is to load a locally created, Hyper-V-based virtual machine into Windows Azure.

“But it’s not really that easy, is it?  I’ve had a heckuva time trying to make this work!”

Actually, once the preliminaries are in place, it is easy.  But uploading anything from your local machine into a Windows Azure storage account requires you to connect to your Azure account, which means having a management certificate in place to authenticate the connection, and that is a process that is hard to discover.  Searching for a quick solution is confusing, because the tools are always changing, and what was required several months ago isn't necessarily the easiest way to do this today.

This leads me to a little disclaimer, which really could apply to every single article written for this series:

The documentation provided here is based on the tools as they exist during the Windows Azure Virtual Machine PREVIEW period.  Capabilities and operations are subject to change without notice prior to the release and general availability of these new features.

That said, I’m going to try to make this process as simple as possible, and leave you not only with the ability to launch a VM from your own uploaded .VHD (virtual hard disk) file, but also in good shape to use some pretty useful tools (such as Windows PowerShell) for managing your Windows Azure-based resources.

The rest of this article assumes that you already have a Windows Azure subscription.  If you don’t have one, you can start a FREE 90-DAY TRIAL HERE.

Create a local VM using Hyper-V

I’m going to assume that you know how to use Hyper-V to create a virtual machine.  You can do this in Hyper-V running on Windows Server 2008 R2 or Windows Server 2012.  You could even use Hyper-V installed on Windows 8.  The end result should be that you have a virtual machine installed as you want it, sysprepped (important!), and ready to go.  It’s that machine’s .VHD (the virtual hard disk) file that you’re going to be uploading into Windows Azure storage.
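If you haven’t sysprepped a machine before, this is the command typically run inside the VM just before its final shutdown (a minimal sketch using the default Windows paths; adjust if yours differ):

%windir%\system32\sysprep\sysprep.exe /generalize /oobe /shutdown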

If you want further help building and preparing a virtual machine, check out the first part of this article on how to build a VM: Creating and Uploading a Virtual Hard Disk that Contains the Windows Server Operating System

NOTE: If you’re going to use one of the storage exploring tools I will be mentioning later, you will want to create your disk as (or convert your disk to) a fixed-format VHD.  This is because those tools won’t convert the disk file on the fly, and the disk in Windows Azure storage is required to be a fixed disk (as opposed to a dynamic disk, which is the default). 
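If your disk was created as a dynamically expanding VHD (the Hyper-V default), you can convert a copy of it to the fixed format with the Hyper-V PowerShell module on Windows Server 2012 or Windows 8.  A minimal sketch, assuming the example file paths below:

# Convert a dynamically expanding VHD into a fixed-size copy (paths are examples)
Convert-VHD -Path D:\SmallTestServer-dynamic.vhd -DestinationPath D:\SmallTestServer.vhd -VHDType Fixed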

Set Up Windows Azure Management

Before we can connect to our Windows Azure storage and start uploading, we need to have a management certificate in place, as well as the tools for doing the upload installed.

Although there are manual ways of creating and uploading a self-signed certificate, the easiest method is to use the Windows Azure PowerShell cmdlets.  Here is the download location for those:

Windows Azure PowerShell: https://www.windowsazure.com/en-us/manage/downloads/ 

Note that although the page says that it’s the November 2012 release, it actually gives you the December 2012 release.  That’s important, because the extremely beneficial Add-AzureVHD PowerShell cmdlet was only introduced in December.

Once those are installed, you can follow the instructions here:

Get Started with Windows Azure Cmdlets: http://msdn.microsoft.com/en-us/library/windowsazure/jj554332.aspx

Specifically, THIS SECTION describes how to use the Get-AzurePublishSettingsFile cmdlet, which generates a certificate in Windows Azure and creates a local “.publishsettings” file that is then imported locally using the Import-AzurePublishSettingsFile cmdlet.  Once that’s done, you’ll have the management certificate in place locally as well as in your Azure account.  And the best part is, this relationship is persistent!  From this point on, any Windows Azure PowerShell window you open will already be associated with your account.
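In practice, the whole certificate setup boils down to two cmdlets.  A minimal sketch (the .publishsettings file name below is just an example of what the download will be called):

# Opens a browser so you can sign in; downloads a .publishsettings file containing the certificate
Get-AzurePublishSettingsFile

# Import the downloaded file to store the certificate and subscription info locally
Import-AzurePublishSettingsFile "C:\Users\Kevin\Downloads\MySubscription.publishsettings"

# Verify that the subscription is now registered
Get-AzureSubscription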

For a really great write-up on setting up and using PowerShell for Windows Azure, check out Michael Washam’s excellent article HERE.

Create an Azure Storage Account

If you have already created a virtual machine in Windows Azure, then you already have a storage account and container that you can use to hold your disks.  But if you haven’t already done this, you will want to go into your portal and create one.

At the bottom of the portal, click “+ New”, and then choose Data Services –> Storage –> Quick Create


You’ll give your storage account a unique name, choose a geographical location, and then create it.

Once it’s created, select the new storage account and create a new “Blob Container” by selecting the CONTAINERS tab, and then clicking “CREATE A BLOB CONTAINER”.


Note the URL.  Copy it to the clipboard or otherwise keep it handy.  This URL will be used when we upload our VHD.
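If you’d rather stay in PowerShell for this step too, the same storage account and blob container can be created with the Azure cmdlets.  A minimal sketch, assuming the example names below and that your release of the module includes the storage cmdlets:

# Create the storage account (name must be globally unique and lowercase)
New-AzureStorageAccount -StorageAccountName "kevremdiskstorage" -Location "East US"

# Build a storage context from the account key, then create the blob container
$key = Get-AzureStorageKey -StorageAccountName "kevremdiskstorage"
$ctx = New-AzureStorageContext -StorageAccountName "kevremdiskstorage" -StorageAccountKey $key.Primary
New-AzureStorageContainer -Name "mydisks" -Context $ctx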

Upload the Hard Disk into the Windows Azure Storage Container

“Kevin..  you also mentioned that we’ll need some tool to do the actual uploads.”

That’s right.  Until recently, the only tool provided by Microsoft for doing this was the “csupload” tool, a command-line utility that is installed with the Windows Azure SDK.  (Windows Azure Tools: http://www.windowsazure.com/en-us/develop/downloads/ – But don’t install it just yet… it installs much more than you need to complete this exercise.)

Once the SDK is installed, and you have the Subscription ID and the certificate thumbprint for your connection, you open the Windows Azure Command Prompt and use the csupload command in two steps: first to set up the connection, and then to do the upload.  The article Creating and Uploading a Virtual Hard Disk that Contains the Windows Server Operating System describes how to use the csupload tool.

All that said… DON’T DO IT!  Unless you’re a developer, the Windows Azure SDK is much more than you need!

“So what’s the alternative, Kevin?”

PowerShell!  Yes, you already have the Windows Azure PowerShell cmdlets installed, so now you’re going to use two of them: Add-AzureVhd and Add-AzureDisk.

Add-AzureVhd does the upload.  This is the one that takes a LONG TIME to run (depending on the size of your .VHD and your upstream connection speed).  The result is a new page blob object up in your storage account.

Add-AzureDisk essentially tells Windows Azure to treat that new blob as a .VHD file that has a bootable operating system in it.  Once that’s done, you can go into the Windows Azure Portal, create a new machine, and see your disk as one of the machine disks available.

So in my example, with a fresh, sysprepped, fixed-disk (10GB) .VHD installation of Windows Server 2012, I run these two commands:

Add-AzureVhd -Destination http://kevremdiskstorage.blob.core.windows.net/mydisks/SmallTestServer.vhd -LocalFilePath d:\SmallTestServer.vhd

Add-AzureDisk -DiskName SmallTestServer -MediaLocation http://kevremdiskstorage.blob.core.windows.net/mydisks/SmallTestServer.vhd -OS Windows

(Of course, the first one takes quite a while for me.  About 13 hours.  Ugh.)
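One tip that may help if you have upload bandwidth to spare: Add-AzureVhd can run multiple uploader threads in parallel.  A sketch, assuming the parameter behaves the same in your release of the cmdlets:

# Same upload as above, but with more parallel uploader threads (16 is just an example value)
Add-AzureVhd -Destination http://kevremdiskstorage.blob.core.windows.net/mydisks/SmallTestServer.vhd -LocalFilePath d:\SmallTestServer.vhd -NumberOfUploaderThreads 16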

“Hey Kevin.. what if I want to use and re-use that image as the basis for multiple machines?”

Excellent question!  And the good news is that, instead of using Add-AzureDisk, you use the Add-AzureVMImage cmdlet to tell Windows Azure that the disk should be made available as a re-usable image.  Like this:

Add-AzureVMImage -ImageName Server2012Eval -MediaLocation http://kevremdiskstorage.blob.core.windows.net/mydisks/SmallTestServer.vhd -OS Windows

Once that’s done, instead of just having a disk to use once for a new machine, I have a starting-point for one or more machines.

Create the Machine

In the portal it’s really no more complex than creating a new machine from the gallery:


Your disk should show up towards the bottom of the list.  Select it, and build your machine.

Once created, you should be able to start it as if it were any other machine built from a previously installed disk.

If you chose to add your disk as an image in the repository, then you can also create the machine using QUICK CREATE, because the image is now available for you to use and re-use.
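And if you’d rather script that last step too, a new VM can be spun up from the uploaded image entirely from PowerShell.  A minimal sketch, assuming the example service name and credentials below; parameter names may differ slightly between releases of the preview cmdlets:

# Create a small VM from the image registered with Add-AzureVMImage (names and password are examples)
# Note: -AdminUsername may not exist in the earliest preview builds; omit it if the cmdlet complains
New-AzureQuickVM -Windows -ServiceName "kevremtestsvc" -Name "SmallTestServer01" -ImageName "Server2012Eval" -AdminUsername "kevadmin" -Password "Y0urStr0ngPassw0rd!" -Location "East US" -InstanceSize Small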

---

Other Errata

As long as we’re discussing working with Windows Azure Storage, here are a couple of tools that make it easier to manage, navigate, and upload/download items in your storage cloud:

Both have free trials, and aren’t really all that expensive.  I’ve had mixed results, and you have to be careful that you’re creating “page blobs” and not “block blobs”.  And with a slow upload connection, these tools are rather fragile.  The benefit is that both allow you to configure a connection to your Windows Azure subscription and multiple storage accounts in order to upload and download your .VHD files.  For our purposes, they will do what the Add-AzureVhd cmdlet did for us, plus let you create and manage storage containers.  You’ll still need to run the Add-AzureDisk or Add-AzureVMImage commands to configure your disks for use.

(Major kudos to Joerg of ClumsyLeaf Software (makers of CloudXplorer), who answered my support questions in a matter of minutes!  And on a Saturday, no less!)

---

What do you think?  Are you going to try this out?  At the very least I hope that this article helps you get PowerShell configured for working with your Windows Azure objects.  Give us your questions or feedback in the comments.


More Stories By Kevin Remde

Kevin is an engaging and highly sought-after speaker and webcaster who has landed several times on Microsoft's top 10 webcast list, and has delivered many top-scoring TechNet events and webcasts. In his past outside of Microsoft, Kevin has held positions such as software engineer, information systems professional, and information systems manager. He loves sharing helpful new solutions and technologies with his IT professional peers.

A prolific blogger, Kevin shares his thoughts, ideas and tips on his “Full of I.T.” blog (http://aka.ms/FullOfIT). He also contributes to and moderates the TechNet Forum IT Manager discussion (http://aka.ms/ITManager), and presents live TechNet Events throughout the central U.S. (http://www.technetevents.com). When he's not busy learning or blogging about new technologies, Kevin enjoys digital photography and videography, and sings in a band. (Q: Midlife crisis? A: More cowbell!) He continues to challenge his TechNet Event audiences to sing Karaoke with him.
