By Bill Watson
January 16, 2009 07:00 AM EST
The inability to recover mission-critical information promptly can spell disaster for a company in today's 24x7 business environment. IDC estimates that server downtime cost organizations roughly $140 billion worldwide in lost worker productivity and revenue in 2007.
Much harder to quantify yet equally important to consider are the opportunity costs associated with failed and delayed recovery of vital applications, databases, and servers. Stringent business requirements and reduced budgets continue to place significant demands on the recovery of data.
Yet, like the property owner who only realizes the importance of homeowner insurance in the wake of a destructive fire, organizations often recognize the value of backup and recovery to the business only after an outage that interrupts operations and results in lost data.
Clearly, this reactive approach must be replaced with a proactive strategy that attaches much greater significance to the role of backup and recovery. By understanding and addressing common backup and recovery issues and implementing the most appropriate solution for their organization, companies can reduce the risk associated with unmet recovery objectives, data loss, and business disruption.
The most important step in maximizing the value of a backup and recovery operation is the development of a backup inventory plan. This step catalogues all data in the enterprise and determines each data set's importance to the organization, both in terms of how quickly it must be recoverable and how critical it is that it can be restored effectively. After all, impressive backup success rates are of little consequence unless the right data is being backed up. Organizations must therefore be sure that they are preserving their most essential data, such as a customer database that is accessed and modified many times a day.
To make sure they are backing up the right data correctly, IT should develop a tiered list of backup and recovery requirements that is aligned with the company's business needs. Once this service catalog is established, it should be reviewed and revisited regularly.
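A tiered service catalog of this kind can be expressed as simple structured data. The sketch below is illustrative only: the tier names, RPO/RTO figures, and backup methods are assumptions, not recommendations, and any real catalog should be derived from the organization's own business requirements.

```python
# A minimal sketch of a tiered backup service catalog. The tier names,
# RPO/RTO values, and methods are illustrative assumptions.

SERVICE_CATALOG = {
    # Tier 1: mission-critical, revenue-impacting data
    "tier1": {"rpo_hours": 1, "rto_hours": 2, "method": "snapshot + off-host backup"},
    # Tier 2: important business data with some tolerance for loss
    "tier2": {"rpo_hours": 24, "rto_hours": 8, "method": "nightly incremental"},
    # Tier 3: archival and reference data
    "tier3": {"rpo_hours": 168, "rto_hours": 72, "method": "weekly full to tape"},
}

def requirements_for(tier: str) -> dict:
    """Look up the backup and recovery requirements for a dataset's tier."""
    return SERVICE_CATALOG[tier]

# Example: a frequently modified customer database belongs in tier 1.
print(requirements_for("tier1")["rto_hours"])  # 2
```

Keeping the catalog in a machine-readable form like this makes the regular review step easier, since the current requirements can be compared directly against what the backup jobs actually implement.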
Once the service catalogue is completed, the respective costs of the organization's backup and recovery operations should also be determined. Many firms fail to balance cost against risk and so miss the most effective options. For example, businesses may pay too much for backup and recovery simply because they are using snapshot technologies and off-host backup for data sets that are not the product of high-volume, mission-critical applications and therefore do not warrant such stringent and expensive technology.
Another critical focus area is the development of sound processes and procedures for managing day-to-day operations. A documented set of operational procedures focused on incident management, root cause analysis, and change management is essential to executing the strategy effectively. Many companies fail to perform root cause analysis and thus repeat the same procedural mistakes, making it virtually impossible to optimize a backup and recovery strategy.
Establishing service level agreements to define the expected level of service, and then routinely and proactively managing and analyzing backup logs to determine whether jobs completed successfully, are also steps toward more effective backup and recovery. And when backups fail, thorough incident management should be applied to discover and resolve issues in a timely manner.
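The log-analysis step described above can be automated. The following sketch assumes a simplified job record format and an assumed six-hour SLA backup window; real backup software has its own log formats and reporting interfaces, so treat this only as an illustration of the pattern.

```python
# A minimal sketch of proactive backup-log analysis: flag failed jobs for
# incident management, and flag successful jobs that exceeded the SLA
# window. The record format and SLA value are assumptions.

from datetime import timedelta

SLA_MAX_DURATION = timedelta(hours=6)  # assumed backup window from the SLA

def analyze_jobs(jobs):
    """Return (failed, sla_breaches) from a list of backup job records."""
    failed = [j for j in jobs if j["status"] != "success"]
    breaches = [j for j in jobs
                if j["status"] == "success" and j["duration"] > SLA_MAX_DURATION]
    return failed, breaches

jobs = [
    {"name": "crm-db",   "status": "success", "duration": timedelta(hours=2)},
    {"name": "file-srv", "status": "failed",  "duration": timedelta(hours=1)},
    {"name": "mail",     "status": "success", "duration": timedelta(hours=7)},
]

failed, breaches = analyze_jobs(jobs)
print([j["name"] for j in failed])    # ['file-srv']
print([j["name"] for j in breaches])  # ['mail']
```

Failed jobs feed the incident-management process, while SLA breaches on otherwise successful jobs are early warnings worth a root cause analysis before they become failures.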
Without the ability to recover data, backing up data is useless. As a result, organizations must not only devise and follow a recovery plan, but also make sure it works.
A good disaster recovery plan takes into consideration all the stages of recovery and all the components required. It addresses the need not only for data recovery but also the availability of skills and resources in order to resolve any problem that may occur.
Once a disaster recovery plan is created, it should be documented along with recovery procedures and tested, not just once, but on a regular basis. By testing and re-testing recovery processes and systems, organizations can uncover and remediate any flaws in their backup and recovery strategy that would leave them vulnerable to the very information loss and downtime the plan was designed to avoid.
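One part of such regular testing is verifying that a restored copy actually matches what was backed up. The sketch below illustrates that idea with a checksum comparison; the temporary files stand in for a real restore job, and the paths and restore step are placeholders.

```python
# A minimal sketch of automated restore verification: restore to a scratch
# location, then compare a checksum of the restored copy against the
# original. The "restore" here is simulated with a file copy.

import hashlib
import tempfile
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_restore(original: Path, restored: Path) -> bool:
    """True if the restored copy is byte-identical to the original."""
    return sha256_of(original) == sha256_of(restored)

# Example using temporary files in place of a real restore job.
with tempfile.TemporaryDirectory() as d:
    src = Path(d) / "source.db"
    dst = Path(d) / "restored.db"
    src.write_bytes(b"customer records")
    dst.write_bytes(b"customer records")  # stand-in for the restore step
    print(verify_restore(src, dst))       # True
```

Running a check like this on a schedule, against real restores of sample data from each tier, turns "we believe the backups are good" into a measurable result.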
Backup and Recovery Services
Of course, while managing backup and recovery in-house may be a viable strategy for many enterprises, other organizations may prefer to outsource these operations to a trusted third party. After all, working with a reliable outsourcer alleviates many of the concerns and challenges of undertaking a comprehensive backup and recovery strategy in-house, including areas such as staffing, compliance, service levels and cost.
For example, while the total cost of ownership of an in-house solution may be difficult to measure, an outsourcer charges a fixed monthly fee, providing organizations the assurance of a known, consistent cost. Likewise, while it is often difficult to establish, adhere to, or measure operational service levels internally, an outsourcer that sets strict SLAs guarantees performance. Also, an outsourcer with in-depth experience defining recovery time objectives and recovery point objectives ensures that organizations are better positioned to recover the right data in the right timeframe.
Organizations may also prefer to work with a third party to leverage the outsourcer's expertise in backup and recovery processes and technologies. An experienced service provider can help the organization understand and address complex compliance demands and aid in responding to audits. In addition, a third-party provider can leverage comprehensive reporting capabilities to bring a more structured approach to protecting critical business data.
By addressing so many critical needs so efficiently, an outsource provider may ultimately enable an organization to focus more on core competencies and deploy critical resources to more strategic business initiatives and projects. The most effective providers are those that offer tailored services, operate according to agreed SLAs, and have a proven record of delivering on their promises and a reputation for ensuring cost-effectiveness.
Whether opting to manage backup and recovery in-house or outsource it to a third party, organizations that make these critical services a business priority will be better positioned to respond effectively in the wake of a disaster or failure. By replacing a reactive attitude toward backup and recovery with a proactive strategy, organizations will significantly mitigate the risk of data loss and business disruption while addressing regulatory and legal demands with efficiency and thrift.
What to Look For in an Outsourced Provider
Once an organization decides to work with a provider to manage backup and recovery operations, it should ask a few simple questions to help secure a beneficial partnership:
- What service levels and metrics do they offer for their backup and recovery service?
- Do they have a documented set of procedures focused on areas such as incident management, problem management, and change management?
- What is the experience level of their engineers related to the product?
- What type of training programs are provided to engineers?
- What automation have they deployed in the delivery of the service around monitoring and reporting?