
@BigDataExpo: Blog Post

Public Sector Big Data: Five Ways Big Data Must Evolve in 2013

2012 will go down as a “Big” year for Big Data in the public sector


Editor's note: This guest post provides context on mission-focused data analytics in the federal space by one of the leaders of the federal Big Data movement, Ray Muslimani. -bg

2012 will go down as a "Big" year for Big Data in the public sector. Rhetoric and hype have been followed by tangible action on the part of both government and industry. The $200 million Big Data initiative unveiled by the White House in March 2012 injected R&D funding and credibility into efforts to develop tools and technologies that help solve the nation's most pressing challenges.

On the industry side, the recently issued TechAmerica report, “Demystifying Big Data,” provides agencies with a roadmap for using Big Data to better serve citizens. It also offers a set of policy recommendations and practical steps agencies can take to get started with Big Data initiatives.

For all of the enthusiasm around Big Data this year, every indication is that 2013 will be the year when Big Data transforms the business of government. Below are five steps that must be taken in 2013 for Big Data to evolve and deliver on its promise.

Demystify Big Data
Government agencies warmed to the potential of Big Data throughout 2012, but more education is required to help decision makers wade through their options and justify further investments. Removing the ambiguities surrounding Big Data requires an emphasis in 2013 on education from both industry and government.

The TechAmerica Big Data report is a good example of how industry can play an active role in guiding agencies through Big Data initiatives. It also underscores that vendors can’t generate more Big Data RFPs through marketing slicks and sales tactics alone. This approach will not demystify Big Data – it will simply seed further doubt if providers of Big Data tools and solutions focus only on poking holes in competitor alternatives.

Industry and government should follow proven templates for education in 2013. For example, agencies can arrange "Big Data Days" modeled on today's Industry Tech Days. Big Data industry days can help IT providers gain better insight into how each agency plans to approach its Big Data challenges in 2013, and offer those agencies an opportunity to see a wide range of Big Data services.

The Big Data education process must also extend to contracting officers. Agencies need guidance on how RFPs can be constructed to address a service-based model.

Consumerize Big Data
While those within the public sector with the proper training and skills to analyze data have benefited from advanced Big Data tools, it has been far more difficult for everyday business users and decision makers to access the data in a useful way. Sluggish query responses, data quality issues, and a clunky user experience are undermining the benefits Big Data analytics can deliver, effectively requiring users to be de facto "data scientists" to make sense of it all.

Supporting this challenge is a 2012 MeriTalk survey, "The Big Data Gap," which finds that just 60 percent of IT professionals say their agency is analyzing the data it collects, and a modest 40 percent are using data to make strategic decisions. All this despite the fact that 96 percent of those surveyed expect their agency's stored data to grow over the next two years by an average of 64 percent. The gap suggests that non-"data scientists" struggle to convert data into business decisions.

What if any government user could ask a question in natural language and receive the answer as a relevant visualization? For Big Data to evolve in 2013, we must consumerize the user experience: move beyond spreadsheets and reports, and place the power of analytics in the hands of users at any level, without requiring analytics expertise.
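To make the idea concrete, here is a toy sketch of "ask in natural language, get a chart-ready answer." Everything in it is illustrative and assumed for this example (the question grammar, the dataset, and the answer format are invented, not drawn from any real product):

```python
# Toy natural-language query: map "total <metric> by <dimension>" questions
# onto a simple aggregation whose result could feed a bar-chart widget.
import re
from collections import defaultdict

# Hypothetical agency dataset, invented for illustration.
RECORDS = [
    {"agency": "DOT", "year": 2012, "requests": 120},
    {"agency": "DOT", "year": 2013, "requests": 180},
    {"agency": "EPA", "year": 2012, "requests": 90},
    {"agency": "EPA", "year": 2013, "requests": 150},
]

def answer(question):
    """Handle questions of the form 'total <metric> by <dimension>'."""
    m = re.match(r"total (\w+) by (\w+)", question.lower())
    if not m:
        raise ValueError("unsupported question")
    metric, dim = m.groups()
    totals = defaultdict(int)
    for row in RECORDS:
        totals[row[dim]] += row[metric]
    return dict(totals)  # chart-ready: labels -> values

print(answer("Total requests by agency"))  # → {'DOT': 300, 'EPA': 240}
```

A real system would of course need a far richer language model and data catalog; the point is only that the user never sees a spreadsheet, a query language, or a report template.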

Mobilize Big Data
IDC Government Insights predicts that in 2013, 35 percent of new Federal and state applications will be mobile. At the same time, 65 percent of Federal IT executives expect mobile device use to increase by 20 percent in 2013, according to The 2012-2013 Telework/Mobile IT Almanac.

Part of consumerizing Big Data means building it for any device so that users do not need to be tethered to their desktops to analyze data. Agency decision makers must be empowered to easily view and analyze data on tablets and smartphones, while the increase of teleworking in the public sector requires Big Data to be accessible from anywhere, at any time, and on any device.

There is promising innovation at work by both established Federal IT providers and upstarts in taking a mobile-first path to Big Data, rather than the traditional approach of building BI dashboards for the desktop. The degree to which 2013 sees a shift in Big Data from the desktop to tablets and smartphones will depend on how forcefully solutions providers employ a mobile-first approach to Big Data.

Act on Big Data
A tremendous amount of “thought” energy went into Big Data in 2012. For Big Data to evolve in a meaningful way in 2013, initiatives and studies must generate more action in the form of Big Data RFIs and RFPs.

In the current tight budget climate, agencies will not act on Big Data if vendor proposals require massive investments in IT infrastructure and staffing. The financial and resource burden must shift, to the extent possible, from agency to vendor. For example, some vendors have developed "Big Data Clouds" that give agencies a secure, scalable framework for storing and managing data, along with a toolset for performing consumer-grade search and analysis on that data.

Open Big Data
Adoption of Big Data solutions has been accelerated by open source tools such as Hadoop, MapReduce, Hive, and HBase. While some agencies will find it tempting to withdraw to the comfort of proprietary Big Data tools that they can control in closed systems, that path undermines the value Big Data can ultimately deliver.
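Part of what makes these open source tools approachable is how small the programming model is. The word count below is a minimal, illustrative sketch of the MapReduce pattern in the style of a Hadoop Streaming job (the sample input and the in-process wiring are assumptions for this example; on a real cluster the mapper and reducer run as separate scripts over stdin/stdout):

```python
# Minimal MapReduce-style word count: the mapper emits (word, 1) pairs
# and the reducer sums the counts per word. sorted() stands in for the
# shuffle phase, where Hadoop groups mapper output by key.
from itertools import groupby

def mapper(lines):
    for line in lines:
        for word in line.strip().lower().split():
            yield word, 1

def reducer(pairs):
    # Keys arrive sorted, so each word's pairs are contiguous.
    for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

counts = dict(reducer(mapper(["big data", "open data"])))
print(counts)  # → {'big': 1, 'data': 2, 'open': 1}
```

The same two-function shape scales from this toy input to petabytes on a Hadoop cluster, which is much of why the open source stack lowered the barrier to entry for agencies.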

One could argue that as open source goes in 2013, so goes Big Data. If open source platforms and tools continue to address agency demands for security, scalability, and flexibility, the benefits from Big Data within and across agencies will increase exponentially. There are hundreds of thousands of open source technologies on the market today. Not all are suitable for agency requirements, but as agencies update and expand their uses of data, these tools offer vast opportunities to innovate. Additionally, opting for open source over proprietary vendor solutions keeps an agency from being locked into a single vendor's tool that it may at some point outgrow or find ill suited to its needs.


More Stories By Bob Gourley

Bob Gourley writes on enterprise IT. He is a founder and partner at Cognitio Corp and publisher of CTOvision.com.
