
Nutanix Fields Next-Gen Software-Defined Data Center Widgetry

Nutanix claims to be the first to deliver RAID, high availability, snapshots and clones at the VM level

Nutanix, a cloud hardware start-up offering a hybrid scale-out compute-cum-storage appliance and backed by $72 million in VC funding, only half of which is reportedly spent, has put out next-generation software-defined data center products.

It's updating its server hardware and its software to deal with divergent workloads. It's going to a quad-node box made by Quanta and should be able to support 400 VMs per chassis, up from 300.

It's got VM-centric disaster recovery, adaptive compression and a new highly configurable hardware platform. The widgetry includes Nutanix OS 3.0 and NX-3000 series hardware. It's supposed to help enterprises build next-generation software-defined data centers.

Besides VM-level disaster recovery and adaptive post-process compression, Nutanix OS 3.0 delivers dynamic cluster expansion, rolling software upgrades and support for KVM, its second hypervisor after VMware's.

Its software enhancements, coupled with the configurable NX-3000 series platform, enable flexibility, performance and scalability in enterprise data centers.

With the NX-3000, Nutanix delivers a configurable platform in which compute- and storage-heavy nodes co-exist in a single heterogeneous cluster. It includes hardware models that vary in capacity and in the number of PCIe SSDs, SATA SSDs and SATA HDDs per server node.

The nodes can have different CPU cores per socket and variable memory capacities. This allows for independent scaling of compute and storage in a single system that's optimized for every use case and can scale to address evolving business requirements.
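As a rough illustration of that independent scaling, here is a minimal sketch in Python; the node names, fields and figures are hypothetical and do not reflect Nutanix's actual models or APIs.

from dataclasses import dataclass

@dataclass
class Node:
    """Hypothetical description of one server node in a mixed cluster."""
    name: str
    cpu_cores: int   # compute-heavy nodes have more cores
    ram_gb: int
    ssd_tb: float    # flash-tier capacity
    hdd_tb: float    # SATA-disk-tier capacity

def cluster_totals(nodes):
    """Aggregate compute and storage across a heterogeneous cluster."""
    return {
        "cores": sum(n.cpu_cores for n in nodes),
        "ram_gb": sum(n.ram_gb for n in nodes),
        "flash_tb": sum(n.ssd_tb for n in nodes),
        "disk_tb": sum(n.hdd_tb for n in nodes),
    }

# Compute-heavy and storage-heavy nodes share one pool; adding either kind
# grows only the dimension you need.
cluster = [
    Node("compute-1", cpu_cores=16, ram_gb=256, ssd_tb=0.8, hdd_tb=4.0),
    Node("storage-1", cpu_cores=8,  ram_gb=128, ssd_tb=1.6, hdd_tb=16.0),
]
print(cluster_totals(cluster))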

Scale-Out Converged Storage (SOCS) virtual disk controllers turn the Nutanix server cluster into a SAN, so compute and storage sit on the same cluster and compute jobs run close to the storage. Nutanix uses flash as the hot tier, with SATA disk below it.
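The practical upshot of that converged design is data locality: a VM's reads can be served from the node it runs on whenever a local copy of the data exists. A minimal sketch of the idea, with invented names rather than Nutanix's actual code paths:

def read_extent(extent_id, local_node, replica_map):
    """Serve a read from the local replica when one exists, otherwise go remote.

    replica_map maps an extent id to the set of nodes holding a copy;
    both the function and the map are hypothetical, for illustration only.
    """
    replicas = replica_map[extent_id]
    if local_node in replicas:
        return f"read {extent_id} from local disk on {local_node}"
    # Fall back to any remote replica over the cluster network.
    remote = next(iter(replicas))
    return f"read {extent_id} over the network from {remote}"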

The NX-3000 uses Intel's Sandy Bridge chips - the eight-core E5-2660 processors running at 2.2GHz - and packs high VM density into a 2U form factor.

Nutanix claims to be the first to deliver RAID, high availability, snapshots and clones at the VM level.

It says it's implemented a highly differentiated VM-centric disaster recovery engine.

The new Nutanix OS 3.0 includes native storage-optimized disaster recovery that enables multi-way, master-master replication, something the company says traditional storage arrays have not offered.

Administrators can configure disaster recovery policies that specify protection domains and consistency groups in primary sites, which can then be replicated to any combination of secondary sites to ensure maximum business resiliency and application performance. And any Nutanix cluster can serve as both a primary and secondary site simultaneously for different protection domains, providing even more flexibility and choice.
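To make the terminology concrete, a protection domain groups VMs into consistency groups and replicates them on a schedule to one or more remote sites. The snippet below is a hypothetical rendering of such a policy, not Nutanix's actual configuration schema.

# Hypothetical shape of a VM-centric DR policy; names and fields are
# illustrative only, not Nutanix's actual schema.
dr_policy = {
    "protection_domain": "tier1-apps",
    "consistency_groups": [
        {"name": "erp", "vms": ["erp-db-01", "erp-app-01"]},
        {"name": "web", "vms": ["web-01", "web-02"]},
    ],
    "schedule_minutes": 60,                                   # snapshot/replication interval
    "remote_sites": ["dr-cluster-east", "dr-cluster-west"],   # multi-way targets
}

def replicate(policy):
    """Ship each consistency group's snapshot to every configured remote site."""
    for group in policy["consistency_groups"]:
        for site in policy["remote_sites"]:
            print(f"replicating group {group['name']} "
                  f"({len(group['vms'])} VMs) to {site}")

replicate(dr_policy)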

Nutanix OS 3.0 is supposed to deliver best-in-class runbook (failover and failback) automation that's hypervisor-agnostic, which means native disaster recovery capabilities are available and consistent regardless of the underlying virtualization platform or management tools.

One of the pillars of the Nutanix solution is a highly efficient MapReduce-based framework that implements information lifecycle management in the cluster to achieve tiering, disk rebuilding and cluster rebalancing.

It's supposedly the first of its kind in the storage industry.
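The general pattern is easy to sketch: map over access records to count how often each extent is touched, reduce the counts cluster-wide, then place hot extents on flash and cold ones on disk. The toy code below shows the shape of that approach; the names, thresholds and log format are invented, not Nutanix's implementation.

from collections import defaultdict

def map_access(log_line):
    """Map phase: emit (extent_id, 1) for every access record."""
    extent_id, _timestamp = log_line.split(",")
    return extent_id, 1

def reduce_counts(pairs):
    """Reduce phase: total accesses per extent across the cluster."""
    counts = defaultdict(int)
    for extent_id, n in pairs:
        counts[extent_id] += n
    return counts

def assign_tier(access_count, hot_threshold=100):
    """Hot extents stay on flash; cold extents migrate to SATA disk."""
    return "flash" if access_count >= hot_threshold else "sata"

access_log = ["e1,t0", "e1,t1", "e2,t0"]     # toy input
heat = reduce_counts(map(map_access, access_log))
placement = {eid: assign_tier(count, hot_threshold=2) for eid, count in heat.items()}
print(placement)   # {'e1': 'flash', 'e2': 'sata'}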

The same framework is being leveraged to deliver adaptive post-process compression of cold data as it migrates to the lower data tiers, so as not to impact the normal IO path.

By leveraging the information lifecycle management capabilities inherent in Nutanix' software, the system dynamically determines which data blocks to compress based on how frequently they're being accessed by the VMs.

Post-process compression is ideal for random or batch workloads and delivers the highest possible overall performance. In addition, Nutanix' OS 3.0 supports basic in-line compression that works as the data is being written, which is better suited for archival and sequential workloads.
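A hedged sketch of the difference between the two modes, using zlib purely as a stand-in compressor and an invented cold-data threshold; none of this is Nutanix's actual logic.

import time
import zlib

COLD_AFTER_SECONDS = 24 * 3600   # illustrative threshold, not a Nutanix default

def compress_if_cold(block, last_access, now=None):
    """Post-process path: compress a block only once it has gone cold.

    block is raw bytes; last_access is a UNIX timestamp. The policy and
    names here are hypothetical, chosen just to show the shape of the idea.
    """
    now = now or time.time()
    if now - last_access < COLD_AFTER_SECONDS:
        return block, False            # still hot: leave it alone, no I/O-path cost
    return zlib.compress(block), True  # cold: compress as it moves down a tier

def inline_write(block):
    """In-line path: compress on write, better suited to sequential/archival data."""
    return zlib.compress(block)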

The company says, "While our existing storage solutions support compression in general, the granularity of Nutanix compression allows us to set policies at the VM level, ensuring maximum business value and storage utilization,"

With Nutanix OS 3.0, the company is supposed to deliver on its commitment to bring all of its enterprise features to the broadest range of platforms in the industry.

The software, which was designed to be hypervisor-agnostic, will now support KVM and VMware vSphere 5.1.

Regardless of the underlying virtualization platform or management framework, enterprises benefit from all of the capabilities of the Nutanix software.

The KVM hypervisor provides financial flexibility for enterprises and works well for workloads such as Hadoop.

Nutanix OS 3.0 also uses a discovery-based protocol to auto-detect new nodes added to the same network as a cluster, enabling administrators to quickly and easily expand a cluster without incurring any downtime.

In the background, the system will then rebalance the data across the entire storage pool, including the newly added nodes, to provide maximum I/O performance.
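A toy sketch of both ideas follows: discovery via a broadcast announcement and a greedy rebalance across the pool. The port, message format and policy are invented for illustration and are not Nutanix's actual protocol.

import json
import socket

DISCOVERY_PORT = 20001            # invented port number for illustration

def announce(node_name):
    """A new node broadcasts its presence on the local subnet (hypothetical protocol)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    message = json.dumps({"node": node_name, "capabilities": ["storage", "compute"]})
    sock.sendto(message.encode(), ("255.255.255.255", DISCOVERY_PORT))
    sock.close()

def rebalance(extent_counts):
    """Pick moves from the fullest node to the emptiest until counts roughly even out."""
    moves = []
    nodes = dict(extent_counts)
    while max(nodes.values()) - min(nodes.values()) > 1:
        src = max(nodes, key=nodes.get)
        dst = min(nodes, key=nodes.get)
        nodes[src] -= 1
        nodes[dst] += 1
        moves.append((src, dst))
    return moves

# A freshly added, empty node attracts data until the pool is even again.
print(rebalance({"node-a": 10, "node-b": 10, "node-c": 0}))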

The new software also uses software-defined networking tricks to achieve rolling software upgrades in the always-on cluster. Upgrades are delivered in a peer-to-peer framework to enable rapid software upgrades while retaining maximum cluster availability.
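The rolling pattern itself is simple to express: upgrade one node at a time and never let availability drop below a floor. A hedged sketch, with hypothetical function names and checks:

def rolling_upgrade(nodes, upgrade_one, is_healthy, min_available):
    """Upgrade one node at a time, pausing whenever availability would dip too low.

    upgrade_one and is_healthy are caller-supplied callables; everything here
    is an illustrative sketch of the rolling-upgrade pattern, not Nutanix code.
    """
    for node in nodes:
        healthy = [n for n in nodes if n != node and is_healthy(n)]
        if len(healthy) < min_available:
            raise RuntimeError(f"refusing to take {node} down: cluster would drop below its availability floor")
        upgrade_one(node)            # node rejoins the cluster on the new version
    return "all nodes upgraded with the cluster online throughout"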

The features and capabilities delivered in Nutanix OS 3.0 and NX-3000 are supposed to usher in a new era of business resiliency and data center optimization.

The start-up thinks it's displaced $25 million in server and SAN storage sales and is close to doubling sales every quarter. Its co-founder and CEO Dheeraj Pandey built the first Exadata clusters at Oracle. Co-founder Mohit Aron was chief architect at Aster Data and a lead designer of the Google File System, which inspired Hadoop's HDFS.

More Stories By Maureen O'Gara

Maureen O'Gara, the most read technology reporter for the past 20 years, is the Cloud Computing and Virtualization News Desk editor of SYS-CON Media. She is the publisher of the famous "Billygrams" and was the editor-in-chief of "Client/Server News" for more than a decade. One of the most respected technology reporters in the business, Maureen can be reached by email at maureen(at) or paperboy(at), and by phone at 516 759-7025. Twitter: @MaureenOGara
