What does “Real Time” in “Real Time Big Data” mean?

Lately there has been a lot of noise about “Real Time” Big Data. Most companies that associate themselves with this term are in the analytics space, and to them it really means “low latency” for analytical processing of data that usually sits in a warehouse such as the Hadoop Distributed File System (HDFS). To achieve this they typically build in-memory indexes over the HDFS data, which allows their customers to run fast queries against it.

Although low latencies are very important, they only cover one side of what “Real Time” really means. The part that is usually not covered is how current the analyzed data is, and in the case of HDFS it is only as current as the last snapshot copied into it. The need for snapshotting comes from the fact that most businesses still run on traditional RDBMS systems (with NoSQL gaining momentum where it fits), so data has to be migrated into HDFS at some point before it can be processed. Such snapshotting is part of most Hadoop deployments today, and it usually happens once or twice a day.

So how can your business run in “Real Time” if most of its decisions are made on yesterday’s data? The architecture needs to be augmented to work with live data: the data that has just been changed or created, not data that is days old. This is where In-Memory Data Grids (a.k.a. Distributed Partitioned Caches) come in.
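
To make the term concrete: a distributed partitioned cache splits the key space across the nodes of a cluster so that every key has exactly one primary owner, and any client can compute that owner without asking a coordinator. Below is a minimal sketch of that mapping in plain Java; the PartitionMapper class and the simple modulo scheme are illustrative assumptions, not any particular product's API.

```java
import java.util.List;

// Minimal illustration of how a distributed partitioned cache decides which
// node owns a key. Real data grids use consistent or rendezvous hashing plus
// backup copies; this modulo-based scheme is an assumption for illustration only.
final class PartitionMapper {
    private final int partitions;      // total number of partitions, e.g. 1024
    private final List<String> nodes;  // identifiers of the cluster nodes

    PartitionMapper(int partitions, List<String> nodes) {
        this.partitions = partitions;
        this.nodes = nodes;
    }

    /** Deterministically assign a key to a partition. */
    int partitionOf(Object key) {
        return Math.floorMod(key.hashCode(), partitions);
    }

    /** Map the partition to the node that currently owns it. */
    String primaryNodeOf(Object key) {
        return nodes.get(partitionOf(key) % nodes.size());
    }
}
```

Because the mapping is deterministic, reads and writes can go straight to the owning node, which is what keeps access at in-memory latency even as the cluster grows.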

By putting a Data Grid in front of HDFS we can keep recent or more relevant state in memory, which allows instant access and fast queries on it. When the data is properly partitioned, you can treat the whole Data Grid as one huge memory space and literally cache terabytes of data in memory. Even then the memory space is limited, so when data becomes less relevant, or simply old, it should be offloaded to an RDBMS, HDFS, or any other storage. With this architecture businesses can process both current and historic data, which is very powerful: financial companies can react quickly to the latest ticks in market data, gaming applications can react to the latest player updates, businesses can analyze the latest performance of ad campaigns, and so on.
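
The access pattern described above can be sketched as a read-through cache with offload on eviction: reads are served from memory, fall through to the backing store (RDBMS, HDFS, etc.) on a miss, and the least recently used entries are written back to that store once the in-memory budget is exceeded. The CacheStore interface and the LRU policy below are assumptions made for illustration, not a specific vendor API.

```java
import java.util.LinkedHashMap;
import java.util.Map;

/** Backing store abstraction (RDBMS, HDFS, HBase, ...); assumed for illustration. */
interface CacheStore<K, V> {
    V load(K key);             // read-through on a cache miss
    void write(K key, V val);  // offload an entry that no longer fits in memory
}

/** One node's slice of an in-memory data grid: hot data in memory, cold data offloaded. */
final class GridCacheNode<K, V> {
    private final CacheStore<K, V> store;
    private final Map<K, V> memory;

    GridCacheNode(int capacity, CacheStore<K, V> store) {
        this.store = store;
        // Access-ordered LinkedHashMap: the eldest entry is the least recently used.
        this.memory = new LinkedHashMap<K, V>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                if (size() > capacity) {
                    store.write(eldest.getKey(), eldest.getValue()); // offload cold data
                    return true;
                }
                return false;
            }
        };
    }

    /** Serve from memory if possible, otherwise read through to the backing store. */
    V get(K key) {
        V val = memory.get(key);
        if (val == null && (val = store.load(key)) != null) {
            memory.put(key, val);
        }
        return val;
    }

    /** Writes land in memory first; persistence happens when the entry is evicted. */
    void put(K key, V val) {
        memory.put(key, val);
    }
}
```

A production grid would do this per partition and typically write behind asynchronously, but the shape of the flow is the same: memory first, storage as the fallback and the overflow.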

Here are some of the important benefits our customers get when deploying GridGain In-Memory Compute and Data Grid in the above architecture:

  • Partitioning of data
  • Real Time MapReduce
  • Integration between Compute and Data Grids and the ability to colocate your computations with data (a.k.a. data affinity); a sketch follows this list
  • In-memory data indexing and the ability to run complex SQL queries, including SQL joins between different types of data
  • Native affinity-aware clients for Java, .NET, C++, Android (with Objective-C client right around the corner)
  • Tight integration with various storage systems, including any type of RDBMS, Hadoop HDFS, HBase, etc.
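
To illustrate the data-affinity item above: instead of pulling data to the computation, the computation is routed to the partition that owns the data, so the closure runs next to the entries it needs and nothing large crosses the network. Below is a minimal single-process sketch in plain Java; the per-partition executors stand in for cluster nodes and, like the class and method names, are assumptions made purely for illustration.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.concurrent.Callable;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.function.Function;

/** Routes a computation to the "node" (here: an executor) that owns a key's partition. */
final class AffinityCompute {
    private final int partitions;
    private final List<ExecutorService> nodes = new ArrayList<>();              // one worker per simulated node
    private final List<ConcurrentMap<String, Double>> data = new ArrayList<>(); // each node's local data slice

    AffinityCompute(int partitions) {
        this.partitions = partitions;
        for (int i = 0; i < partitions; i++) {
            nodes.add(Executors.newSingleThreadExecutor());
            data.add(new ConcurrentHashMap<>());
        }
    }

    private int partitionOf(String key) {
        return Math.floorMod(key.hashCode(), partitions);
    }

    /** Store a value in the partition that owns the key. */
    void put(String key, double value) {
        data.get(partitionOf(key)).put(key, value);
    }

    /** Run the closure on the partition that owns the key; the data never moves. */
    <R> Future<R> affinityCall(String key, Function<Map<String, Double>, R> job) {
        int p = partitionOf(key);
        Callable<R> task = () -> job.apply(data.get(p));
        return nodes.get(p).submit(task);
    }
}
```

A caller would invoke something like affinityCall("AAPL", slice -> slice.get("AAPL")) and the lookup would execute where the tick data lives, which is exactly the colocation the list item refers to.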

Note that in this architecture In-Memory Data and Compute Grids complement warehouse technologies such as HDFS: the grid handles in-memory processing of current data, while the warehouse handles processing of historic data.

