Taking Backup and Recovery Management Off IT’s To-Do List

Backup inventory plan maximizes value and efficiency

The inability to recover mission-critical information promptly can spell disaster for a company in today's 24x7 business environment. IDC estimates that server downtime cost organizations roughly $140 billion worldwide in lost worker productivity and revenue in 2007.

Much harder to quantify yet equally important to consider are the opportunity costs associated with failed and delayed recovery of vital applications, databases, and servers. Stringent business requirements and reduced budgets continue to place significant demands on the recovery of data.

Yet, like the property owner who only realizes the importance of homeowner insurance in the wake of a destructive fire, organizations often recognize the value of backup and recovery to the business only after an outage that interrupts operations and results in lost data.

Clearly, this reactive approach must be replaced with a proactive strategy that attaches much greater significance to the role of backup and recovery. By understanding and addressing common backup and recovery issues and implementing the most appropriate solution for their organization, companies can reduce the risk associated with unmet recovery objectives, data loss, and business disruption.

Better Backup

The most important step in maximizing the value of a backup and recovery operation is the development of a backup inventory plan. This plan catalogs all data in the enterprise and determines each data set's importance to the organization, both in how quickly it must be recoverable and in how critical it is that it can be restored effectively. After all, impressive backup success rates are of little consequence unless the right data is being backed up. Organizations must therefore be sure that they are preserving their most essential data, such as a customer database that is accessed and modified many times a day.

To make sure they are backing up the right data correctly, IT should develop a tiered list of backup and recovery requirements that is aligned with the company's business needs. Once this service catalog is established, it should be reviewed and revisited regularly.
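A tiered service catalog of this kind can be represented very simply in code. The sketch below is illustrative only: the tier names, recovery point objectives (RPO), recovery time objectives (RTO), and backup methods are hypothetical values an organization would replace with figures aligned to its own business needs.

```python
from dataclasses import dataclass

@dataclass
class BackupTier:
    """One tier in a hypothetical backup service catalog."""
    name: str          # illustrative tier label
    rpo_hours: float   # recovery point objective: max tolerable data loss
    rto_hours: float   # recovery time objective: max tolerable downtime
    method: str        # illustrative backup technique for this tier

# A hypothetical three-tier catalog, strictest (and costliest) first
catalog = [
    BackupTier("Tier 1 - customer database", rpo_hours=0.25, rto_hours=1,
               method="snapshot + off-host replication"),
    BackupTier("Tier 2 - internal applications", rpo_hours=24, rto_hours=8,
               method="nightly incremental"),
    BackupTier("Tier 3 - archives", rpo_hours=168, rto_hours=72,
               method="weekly full to tape"),
]

def tier_for(asset_rpo_hours: float, asset_rto_hours: float) -> BackupTier:
    """Pick the least stringent (cheapest) tier that still meets
    an asset's recovery objectives."""
    for tier in sorted(catalog, key=lambda t: t.rpo_hours, reverse=True):
        if tier.rpo_hours <= asset_rpo_hours and tier.rto_hours <= asset_rto_hours:
            return tier
    return catalog[0]  # fall back to the strictest tier
```

Mapping each asset to the cheapest tier that still meets its objectives is one way to avoid paying for snapshot-grade protection on data that does not warrant it.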

Once the service catalog is completed, the respective costs of the organization's backup and recovery operations should also be determined. Many firms fail to balance cost against risk when weighing the available options. For example, businesses may pay too much for backup and recovery simply because they are using snapshot technologies and off-host backup for data sets that are not the product of high-volume, mission-critical applications and, therefore, do not warrant such stringent and expensive technology.

Another critical focus area is the development of sound processes and procedures for the management of day-to-day operations. A documented set of operating procedures focused on incident management, root cause analysis, and change management is essential to executing the strategy effectively. Many companies fail to perform root cause analysis, and thus repeat the same procedural mistakes, making it virtually impossible to optimize a backup and recovery strategy.

Establishing service level agreements to define the expected level of service, and then proactively managing and analyzing backup logs on a routine basis to determine whether jobs complete successfully, are also steps toward more effective backup and recovery. And, when backups fail, thorough incident management should be applied to discover and resolve issues in a timely manner.
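Routine log analysis against an SLA can be as simple as counting outcomes per job. The sketch below assumes a hypothetical log format (timestamp, job name, status); real backup products emit their own formats, so the parsing line would need adapting.

```python
from collections import Counter

# Hypothetical log format: "<timestamp> <job-name> <STATUS>"
SAMPLE_LOG = """\
2024-01-05T01:00 db-full SUCCESS
2024-01-05T02:00 files-incr FAILED
2024-01-06T01:00 db-full SUCCESS
2024-01-06T02:00 files-incr FAILED
"""

def summarize(log_text: str, sla_success_rate: float = 0.95):
    """Count outcomes per backup job and flag jobs whose success
    rate falls below the SLA threshold."""
    outcomes: dict[str, Counter] = {}
    for line in log_text.splitlines():
        timestamp, job, status = line.split()
        outcomes.setdefault(job, Counter())[status] += 1
    breaches = []
    for job, counts in outcomes.items():
        rate = counts.get("SUCCESS", 0) / sum(counts.values())
        if rate < sla_success_rate:
            breaches.append((job, rate))
    return breaches

print(summarize(SAMPLE_LOG))  # [('files-incr', 0.0)]
```

Feeding the flagged jobs into the incident-management process described above closes the loop between monitoring and remediation.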

Rapid Recovery

Without the ability to recover data, backing up data is useless. As a result, organizations must not only devise and follow a recovery plan, but also make sure it works.

A good disaster recovery plan takes into consideration all the stages of recovery and all the components required. It addresses the need not only for data recovery but also the availability of skills and resources in order to resolve any problem that may occur.

Once a disaster recovery plan is created, it should be documented along with recovery procedures, and tested, not just once, but on a regular basis. By testing and re-testing recovery processes and systems, organizations can uncover and remediate any flaws in their backup and recovery strategy that would leave them vulnerable to the very information loss and downtime that such a plan was designed to avoid.
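One small piece of a recovery test can even be automated: after a trial restore, verify that every restored file matches its original. The sketch below is a minimal illustration of that idea using checksums; a full recovery test would of course also cover applications, databases, and staffing, not just files.

```python
import hashlib
from pathlib import Path

def sha256(path: Path) -> str:
    """Checksum a file so originals and restores can be compared."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def verify_restore(original_dir: Path, restored_dir: Path) -> list[str]:
    """Return a list of files that are missing or corrupted in the
    restore target, relative to the original directory tree."""
    problems = []
    for src in original_dir.rglob("*"):
        if not src.is_file():
            continue
        rel = src.relative_to(original_dir)
        dst = restored_dir / rel
        if not dst.exists():
            problems.append(f"missing: {rel}")
        elif sha256(src) != sha256(dst):
            problems.append(f"differs: {rel}")
    return problems
```

An empty result from a scheduled run of such a check is evidence, rather than hope, that the backups are restorable.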

Backup and Recovery Services

Of course, while managing backup and recovery in-house may be a viable strategy for many enterprises, other organizations may prefer to outsource these operations to a trusted third party. After all, working with a reliable outsourcer alleviates many of the concerns and challenges of undertaking a comprehensive backup and recovery strategy in-house, including areas such as staffing, compliance, service levels and cost.

For example, while the total cost of ownership of an in-house solution may be difficult to measure, an outsourcer charges a fixed monthly fee, providing organizations the assurance of a known, consistent cost. Likewise, while it is often difficult to establish, adhere to, or measure operational service levels internally, an outsourcer that sets strict SLAs guarantees performance. Also, an outsourcer with in-depth experience working out recovery time objectives and recovery point objectives ensures that organizations are better positioned to recover the right data in the right timeframe.

Organizations may also prefer to work with a third party to leverage the outsourcer's expertise in backup and recovery processes and technologies. An experienced service provider can help the organization understand and address complex compliance demands as well as aid in responding to audits. In addition, an outsource provider can leverage comprehensive reporting capabilities to bring a more structured approach to protecting critical business data.

By addressing so many critical needs so efficiently, an outsource provider may ultimately enable an organization to focus more on core competencies and deploy critical resources to more strategic business initiatives and projects. The most effective providers are those that offer tailored services, operate according to agreed SLAs, and have a proven record of delivering on their promises and a reputation for ensuring cost-effectiveness.

Whether opting to manage backup and recovery in-house or outsource it to a third party, organizations that make these critical services a business priority will be better positioned to respond effectively in the wake of a disaster or failure. By replacing a reactive attitude toward backup and recovery with a proactive strategy, organizations will significantly mitigate the risk of data loss and business disruption while addressing regulatory and legal demands with efficiency and thrift.

What to Look For in an Outsourced Provider

Once an organization decides to work with a provider to manage backup and recovery operations, it should ask a few simple questions to help secure a beneficial partnership:

  • What service levels and metrics do they offer for their backup and recovery service?
  • Do they have a documented set of procedures focused on areas such as incident management, problem management, and change management?
  • What is the experience level of their engineers related to the product?
  • What type of training programs are provided to engineers?
  • What automation have they deployed in the delivery of the service around monitoring and reporting?

More Stories By Bill Watson

Bill Watson is director of product management for Symantec’s Managed Backup Services. Prior to his current position, Bill held various roles within Symantec’s global services and enterprise technical support organizations, as well as positions with AT&T and World Commerce Online, a business to business e-commerce company.
