Step-by-Step: Tired of Tapes? Backup SQL Databases to the Cloud

New feature in SQL Server 2012 SP1 CU2 provides native backup to Windows Azure cloud storage

I think every IT Pro I’ve ever met hates tape backups … but having an offsite component in your backup strategy is absolutely necessary for effective disaster recovery.  One of the new features provided in SQL Server 2012 Service Pack 1 Cumulative Update 2 is the ability to back up SQL databases and logs to Windows Azure cloud storage using native SQL Server Backup, via both Transact-SQL (T-SQL) and SQL Server Management Objects (SMO).

Backup to cloud storage is a natural fit for disaster recovery, as our backups are instantly located offsite when completed.  And the pay-as-you-go model of cloud storage economics makes it really cost-effective – Windows Azure storage costs are less than $100/TB per month for geo-redundant storage, based on current published costs as of this article’s date.  That’s less than the cost of a couple of SDLT tapes! You can check out our current pricing model for Windows Azure Storage on our Price Calculator page.

In this article, I’ll step through the process of using SQL Server 2012 SP1 CU2 native backup capabilities to create database backups on Windows Azure cloud storage.

How do I get started?
To get started, you’ll need a Windows Azure subscription.  Good news! You can get a FREE 90-Day Windows Azure subscription to follow along with this article, evaluate and test … this subscription is 100% free for 90 days, and there’s absolutely no obligation to convert to a paid subscription.

  • DO IT: Sign up for a Free 90-Day Windows Azure Subscription

    NOTE: When activating your FREE 90-Day Subscription for Windows Azure, you will be prompted for credit card information.  This information is used only to validate your identity, and your credit card will not be charged unless you explicitly convert your FREE Trial account to a paid subscription at a later point in time.

You’ll also need to download Cumulative Update 2 for SQL Server 2012 Service Pack 1 and apply that to the SQL Server instance with which you’ll be testing.

Don’t have a SQL Server 2012 instance in your data center that you can test with?  No problem! You can spin up a SQL Server 2012 VM in the Windows Azure Cloud using your Free 90-Day subscription.

Let’s grab some cloud storage
Once you’ve got your Windows Azure subscription activated and your SQL Server 2012 lab environment patched with SP1 CU2, you’re ready to provision some cloud storage that can be used as a backup location for SQL databases …

  1. Launch the Windows Azure Management Portal and log in with the credentials used when activating your FREE 90-Day Subscription above.
  2. Click Storage in the left navigation pane of the Windows Azure Management Portal.

    [Screenshot: Windows Azure Management Portal – Storage Accounts]
  3. On the Storage page of the Windows Azure Management Portal, click +NEW on the bottom toolbar to create a new storage account location.

    [Screenshot: Creating a new Windows Azure Storage Account location]
  4. Click Quick Create on the New > Storage popup menu and complete the fields as listed below:

    - URL: XXXbackup01 (where XXX represents your initials in lowercase)

    - Region / Affinity Group: Select an available Windows Azure datacenter region for your new Storage Account. 

    NOTE: Because you will be using this Storage Account location for backup / disaster recovery scenarios, be sure to select a datacenter region that is not located near you, for additional protection against disasters that may affect your entire local area.

    Click the Create Storage Account button to create your new Storage Account location.
  5. Wait for your new Storage Account to be provisioned. 

    [Screenshot: Provisioning new Windows Azure Storage Account]

    Once the status of your new Storage Account shows as Online, you may continue with the next step.
  6. Select your newly created Storage Account and click the Manage Keys button on the bottom toolbar to display the Manage Access Keys dialog box.

    [Screenshot: Manage Access Keys dialog box]

    Click the copy button located next to the Secondary Access Key field to copy this access key to your clipboard for later use.
  7. Create a container within your Windows Azure Storage Account to store backups.  Click on the name of your Storage Account on the Storage page in the Windows Azure Management Portal to drill into the details of this account, then select the Containers tab located at the top of the page.

    [Screenshot: Containers tab within a Windows Azure Storage Account]

    On the bottom toolbar, click the Add Container button to create a new container named “backups”.

You’ve now completed the provisioning of your new Windows Azure storage account location.

We’re ready to back up to the cloud
When you’re ready to test a SQL database backup to the cloud, launch SQL Server Management Studio and connect to your SQL Server 2012 SP1 CU2 database engine instance.  After you’ve done this, proceed with the following steps to complete a backup …

  1. In SQL Server Management Studio, right-click the database you wish to back up in the Object Explorer pane and select New Query.

    [Screenshot: SQL Server Management Studio]
  2. In the new SQL Query Window, execute the following Transact-SQL code to create a credential that can be used to authenticate to your Windows Azure Storage Account with secure read/write access:

    CREATE CREDENTIAL myAzureCredential
    WITH IDENTITY = 'XXXbackup01',
    SECRET = 'PASTE IN YOUR COPIED ACCESS KEY HERE';

    Prior to running this code, be sure to replace XXXbackup01 with the name of the Windows Azure Storage Account created above, and replace the SECRET placeholder with the Access Key you previously copied to your clipboard.

  3. In the SQL Query Window, execute the following Transact-SQL code to perform the database backup to your Windows Azure Storage Account:

    BACKUP DATABASE database_name TO
    URL='https://XXXbackup01.blob.core.windows.net/backups/database_name.bak'
    WITH CREDENTIAL='myAzureCredential', STATS = 5;


    Prior to running this code, be sure to replace XXXbackup01 with the name of your Windows Azure Storage Account and replace database_name with the name of your database.  (A transaction log variant of this backup is sketched just after these steps.)

    Upon successful execution of the backup, you should see SQL Query result messages similar to the following:

    [Screenshot: Successful Backup Results]
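
As mentioned in the intro, this feature covers transaction log backups as well as full database backups. Below is a minimal sketch of a log backup to the same container, followed by an optional query against the standard msdb history tables to confirm where recent backups landed. The blob name database_name_log.trn is just an illustrative placeholder, and the log backup assumes the database is not using the SIMPLE recovery model:

-- Back up the transaction log to the same Windows Azure container
-- (illustrative blob name; requires the FULL or BULK_LOGGED recovery model)
BACKUP LOG database_name TO
URL='https://XXXbackup01.blob.core.windows.net/backups/database_name_log.trn'
WITH CREDENTIAL='myAzureCredential', STATS = 5;

-- Optional: list recent backups and their destinations from msdb backup history
SELECT TOP (5) bs.database_name, bs.type, bs.backup_finish_date, bmf.physical_device_name
FROM msdb.dbo.backupset AS bs
JOIN msdb.dbo.backupmediafamily AS bmf
    ON bs.media_set_id = bmf.media_set_id
ORDER BY bs.backup_finish_date DESC;

As with the database backup above, replace database_name, XXXbackup01 and the credential name to match your environment.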

What about restoring?
Restoring from the cloud is just as easy as backing up … to restore, we can use the following Transact-SQL syntax:

RESTORE DATABASE database_name FROM
URL='https://XXXbackup01.blob.core.windows.net/backups/database_name.bak'
WITH CREDENTIAL='myAzureCredential', STATS = 5, REPLACE;
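
If you’re restoring onto a different server or instance where the original file paths don’t exist, the standard RESTORE options still apply when restoring from URL. Here’s a hedged sketch, assuming your build supports RESTORE FILELISTONLY against a URL device, and using placeholder logical file names and target paths that you would replace with your own:

-- Inspect the backup to discover the logical file names it contains
RESTORE FILELISTONLY FROM
URL='https://XXXbackup01.blob.core.windows.net/backups/database_name.bak'
WITH CREDENTIAL='myAzureCredential';

-- Restore while relocating the data and log files (placeholder logical names and paths)
RESTORE DATABASE database_name FROM
URL='https://XXXbackup01.blob.core.windows.net/backups/database_name.bak'
WITH CREDENTIAL='myAzureCredential',
    MOVE 'database_name' TO 'D:\SQLData\database_name.mdf',
    MOVE 'database_name_log' TO 'E:\SQLLogs\database_name_log.ldf',
    STATS = 5, REPLACE;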

Thoughts? Comments? Feedback?
What are your thoughts around leveraging the cloud for backup storage? Feel free to post your comments, questions and feedback below.

-Keith

Build Your Lab! Download Windows Server 2012
Don’t Have a Lab? Build Your Lab in the Cloud with Windows Azure Virtual Machines
Want to Get Certified? Join our Windows Server 2012 "Early Experts" Study Group

More Stories By Keith Mayer

Keith Mayer is a Technical Evangelist at Microsoft focused on Windows Infrastructure, Data Center Virtualization, Systems Management and Private Cloud. Keith has over 17 years of experience as a technical leader of complex IT projects in diverse roles such as Network Engineer, IT Manager, Technical Instructor and Consultant. He has consulted and trained thousands of IT professionals worldwide on the design and implementation of enterprise technology solutions.

Keith is currently certified on several Microsoft technologies, including System Center, Hyper-V, Windows, Windows Server, SharePoint and Exchange. He also holds other industry certifications from IBM, Cisco, Citrix, HP, CheckPoint, CompTIA and Interwoven.

Keith is the author of the IT Pros ROCK! Blog on Microsoft TechNet, voted as one of the Top 50 "Must Read" IT Blogs.

Keith also manages the Windows Server 2012 "Early Experts" Challenge - a FREE online study group for IT Pros interested in studying and preparing for certification on Windows Server 2012. Join us and become the next "Early Expert"!
