By Tom Leyden
November 8, 2012 09:00 AM EST
It’s probably a good idea to state up front that I wrote this blog while employed by Amplidata, but on my own time. This article reflects my own opinion, not necessarily that of Amplidata or its partners.
As I am writing this, I am crossing the Atlantic for the seventh time in about two months. I’m on my way to CloudExpo West in Santa Clara, one of the few technology trade shows that are still growing. At the event I will be sitting on the last Object Storage for Big Data panel of the season. Robin Harris – aka StorageMojo – and I have been working hard this fall to educate the industry on the benefits, challenges and opportunities of Object Storage. We’ve been trying to explain how the current generation of Object Storage platforms differs from the first attempt at it (EMC’s Centera), how it enables companies to cope with the massive amounts of unstructured data that we are all generating, and how companies can even monetize archived data by re-activating their archives.
Unlike StorageMojo and some of the other people I have been working with lately, I don’t have decades of experience in the storage industry. However, being located in Belgium, I’ve had the privilege of working with people who used to be part of the Filepool team (and spent years at EMC after the acquisition). Those were the earliest object storage days, and I had no idea of what was coming. Later, at Sun, I learned a lot about Object Storage when we were working on the Sun Cloud project. The architecture (ZFS) was different from what we are seeing on the market today, but the concept was – as was often the case at Sun – promising. This article is not another attempt at describing Object Storage and the benefits it brings; it’s more an overview of what we have learned at the past four Object Storage for Big Data panels. The setup for each of the panels was mostly the same: Robin Harris would challenge four to six Object Storage specialists (technology vendors or users) and try to get the audience to participate. We expected the topics of the panels to differ, as we were hosted by trade shows with different audiences, but we never expected the discussions to vary as much as they did.
The common thread across the panels was the challenge companies face in storing different types of Big Data, and more particularly Big Unstructured Data. The latter represents up to 90% of the digital data that we will be generating over the next decades and will put traditional storage technologies under heavy stress as they hit their scalability limits. Unstructured data is currently mostly stored in file-system-based storage infrastructures. File systems will not only be unable to scale as required – try setting up a file structure for 5 petabytes of data – but they will also become obsolete, as applications can provide many more features to keep your unstructured data organized (structured?), to analyze that information, and potentially to monetize what is today stored in (dead) tape archives. Rich applications that talk directly to a large and (infinitely) scalable storage pool make a lot more sense than maintenance-intensive file systems. Also, a properly designed Object Storage platform (with erasure coding technology instead of RAID to protect the data) requires a lot less overhead, consumes a lot less power, can easily be implemented across multiple sites, and does not require migration to new systems when an existing system cannot be scaled further. So what else did we discuss at the panels?
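To put a rough number on the overhead argument, here is a back-of-the-envelope comparison in Python. These figures are my own illustration, not vendor specifications: the point is that a wide erasure-coded layout can match RAID-like overhead while tolerating far more simultaneous losses than replication.

```python
# Rough storage-overhead comparison (illustrative numbers, my own).
# n = total fragments/copies stored, k = fragments needed to read the data.
def overhead(n, k):
    """Raw capacity consumed per byte of user data, as a ratio."""
    return n / k

three_way_replication = overhead(3, 1)   # 3.0x raw, survives 2 losses
raid6_style_8_plus_2  = overhead(10, 8)  # 1.25x raw, survives 2 losses
erasure_16_of_20      = overhead(20, 16) # 1.25x raw, survives any 4 losses
```

At the same 1.25x overhead, the 16-of-20 spread doubles the number of concurrent failures the data survives compared to an 8+2 layout, which is the durability-per-dollar argument the panels kept returning to.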
The first panel after summer was at Intel’s IDF in San Francisco. Panel members came from Intel and Quanta, which together with Amplidata built an Object Storage reference architecture. We also had Michelle Munson of Aspera, who presented a couple of perfect use cases for Object Storage in the media and entertainment industry. Aspera developed a very smart way to transfer large amounts of data over the WAN much more efficiently than is currently done. Aspera’s bandwidth optimization software practically enables this new generation of Object Storage by taking away the latency issue, e.g. to stream high-res movies over long distances. Once we had explained the drivers for Object Storage, the opportunities and best practices, most of the discussion (questions from the audience) was about why RAID is not the right technology to architect an Object Storage platform with. We discussed the benefits of erasure coding in much detail and spent a lot of time on the differences from RAID. In short: in erasure-coding-based systems, all disks are equal (all parity) and there is no need to rebuild a broken disk: when codes are lost due to bit errors or hardware failures, new codes can be generated and spread over the whole pool, not just one system. A recent and very good independent deep dive into the Amplidata erasure coding technology can be found here.
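To make the k-of-n idea concrete, here is a toy erasure encoder/decoder in Python. This is purely my own illustration – Amplidata’s actual codes are far more sophisticated and work over different fields – but it shows the essential property: data is spread over n fragments, any k of which suffice to reconstruct it, so a dead disk never triggers a dedicated rebuild of that disk’s contents.

```python
# Toy k-of-n erasure code via polynomial interpolation over GF(257).
# Illustrative only: production systems use optimized codes, not this.

P = 257  # prime field, large enough to hold any byte value 0..255

def _lagrange_eval(points, x):
    """Evaluate the unique polynomial through `points` at `x`, mod P."""
    total = 0
    for i, (xi, yi) in enumerate(points):
        num = den = 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

def encode(block, n):
    """Spread a block of k bytes over n fragments (one per disk)."""
    data_points = [(i + 1, b) for i, b in enumerate(block)]
    return [(x, _lagrange_eval(data_points, x)) for x in range(1, n + 1)]

def decode(fragments, k):
    """Rebuild the original k bytes from ANY k surviving fragments."""
    pts = fragments[:k]
    return bytes(_lagrange_eval(pts, x) for x in range(1, k + 1))

# 5 bytes spread over 8 fragments: any 3 disks can die.
fragments = encode(b"data!", 8)
recovered = decode(fragments[3:], 5)  # first three "disks" lost
```

When fragments disappear, the repair step is just running `encode` again from any k survivors and scattering fresh fragments across the pool – which is why the panel kept stressing that there is no single-disk rebuild window as with RAID.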
There was a lot less RAID and erasure coding at the Createasphere DAM Show in New York a few weeks later. The show focuses on Digital Asset Management, and the attendees are more interested in the applications and content than in the actual data. That did not make the discussion any less interesting. From Sarah Berndt of Johnson Space Center we learned a *lot* about the importance of metadata, an issue that would be discussed at SNW Europe as well (see further). An interesting newcomer on the panel was Dalet, a DAM vendor that integrates with many Object Storage platforms and sees a clear benefit in having its platform interface with a scale-out storage pool directly (REST) rather than through an additional file system. Dalet is the perfect valet in my car analogy, which is becoming more and more popular: a file system is like a public parking lot where you have to go find your car yourself (this once took me a few hours at Paris’ CDG airport). Object storage is much more like valet parking, where you get a ticket when you leave your car and use that ticket to get it back later. The application, Dalet, is the valet.
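The valet analogy maps directly onto the interface of an object store. As a toy sketch (my own illustration, not any vendor’s actual API): `put` takes the object and hands back an opaque ticket, `get` exchanges the ticket for the object, and there is no directory tree to wander through.

```python
import uuid

class ToyObjectStore:
    """A minimal flat-namespace object store: no paths, no hierarchy."""

    def __init__(self):
        self._objects = {}  # ticket -> (data, metadata)

    def put(self, data, metadata=None):
        """Hand over an object; receive an opaque ticket (the object ID)."""
        ticket = uuid.uuid4().hex
        self._objects[ticket] = (data, dict(metadata or {}))
        return ticket

    def get(self, ticket):
        """Exchange the ticket for the object and its metadata."""
        return self._objects[ticket]

# The application (the valet) keeps the ticket; the user never
# navigates a parking lot of nested folders.
store = ToyObjectStore()
ticket = store.put(b"<video frames>", {"title": "trailer", "codec": "h264"})
data, meta = store.get(ticket)
```

Real platforms expose exactly this shape over HTTP/REST – PUT an object and keep its key, GET it back by key – which is the direct, file-system-free integration a DAM application like Dalet builds against.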
At SNWUSA in Santa Clara in October we had David Chapa of Quantum on board for the first time. David is an authority on the use cases where tape is the better alternative and on when it is better to use Object Storage, or Wide Area Storage (WAS) as Quantum calls it. WAS is Quantum’s attempt to take away the confusion caused by the name Object Storage, a term first used by EMC almost a decade ago. I think it’s a good idea for Quantum to try to introduce a new term; I’m just not sure WAS is the best choice. Maybe something new will come up next month at Greg Duplessie’s Object Storage summit, although I doubt it. Once we had more or less agreed that this generation of Object Storage, or whatever it will be called later, has very little or nothing to do with EMC’s product line – most famous for locking in customers – the conversation took a sudden turn. In an attempt to spice up the discussion, Ranajit Nevatia of Panzura claimed Object Storage provides very bad performance. This was very much true for the first generation of Object Storage platforms we had just discussed, and it might be true of the platforms they currently promote (including Atmos, EMC’s second attempt at Object Storage), but not at all for the technologies that are most successful on the market today. Scality have been promoting their high IOPS (smaller files, IO-intensive workloads). Amplidata focus more on large-file storage, which is IMO the more obvious use case for Object Storage, but I may be biased. In a recent independent test, Amplidata demonstrated throughput numbers that can only be called “extremely high-performant”: Howard Marks confirmed Amplidata provides 1 GB/s of throughput with a single controller. But it gets better: Amplidata scales throughput linearly by adding more controllers, so a system with six controllers provides 6 GB/s of throughput.
Last week’s panel at SNW Europe, which is traditionally well attended by press and analysts, was again very interactive. Robin Harris set the stage by explaining how this generation of Object Storage is different from earlier products. This led to a lengthy discussion about APIs, a call for one standard API (I say let’s just all standardize on Amazon’s) and complaints about lock-in by … yes, EMC. Vendors be warned: that trick is getting old and is not getting any respect. The audience included some of the better analysts and bloggers, including the451’s Simon Robinson and Storagebod. The latter, known for being a critic of the Object Storage paradigm (with great arguments), helped us bring the discussion to the next level by raising interesting topics such as the importance of metadata for the applications: who or what will enter the metadata? The application? People? The panel acknowledged that, while applications already generate quite a lot of metadata, companies will have to make business decisions on how much metadata they need. Adding more metadata comes at a cost, as it will require manual work. The day after the panel, it was interesting to see Chris Mellor be critical of Object Storage in his review of the show (how dare the Object Storage vendors doubt the many benefits of tape?). Chris, join us on the panel next time!