
How to Harden Your APIs by Andy Thurai

The market for APIs has grown explosively in recent years, yet one of the major issues providers still face is protecting and hardening the APIs they expose to users. This is especially difficult when you expose APIs from a cloud-based platform, given the constraints the various cloud providers impose. To get there, you need a solution that provides hardening capabilities out of the box, yet still allows the granular settings to be customized to your solution's needs. Intel has such a solution, and it has been well thought out. If this is something you are after, this article should help you see its many uses and its versatility.

Identify sensitive data and the sensitivity of your API

The first step in protecting sensitive data is identifying it as such. This can be anything from PII to PHI to PCI data. Perform a complete analysis of the data flowing into and out of your API, including all parameters, to figure this out.

Once identified, make sure only authorized people can access the data.

This requires solid identity, authentication, and authorization systems to be in place; all of these can be provided by the same system. Your API should be able to identify multiple types of identities. To achieve an effective identity strategy, your system will need to accept identities in the older formats such as X.509, SAML, and WS-Security, as well as the newer breed of OAuth, OpenID, and the like. In addition, your identity system must mediate between these identities, acting as an identity broker, so it can securely and efficiently translate incoming credentials into a form your API can consume.
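As a rough illustration of that broker pattern, here is a minimal Python sketch that normalizes a couple of credential formats into one internal principal. The validator functions are stubs with hypothetical names; a real broker would verify signatures and token validity against the issuing identity provider.

```python
# Minimal identity-broker sketch: accept several credential formats,
# emit one normalized principal the API can consume. Validator internals
# are stubbed; real deployments verify credentials against an IdP.
from dataclasses import dataclass

@dataclass
class Principal:
    subject: str
    scheme: str       # which credential format authenticated the caller
    attributes: dict  # claims/roles used later for authorization

def _validate_oauth(token: str) -> Principal:
    # Placeholder: a real broker would introspect the token with the
    # authorization server and check expiry and audience.
    return Principal("user@example.com", "oauth2", {"scope": "read"})

def _validate_saml(assertion: str) -> Principal:
    # Placeholder: verify the assertion's XML signature and conditions.
    return Principal("user@example.com", "saml", {"role": "auditor"})

VALIDATORS = {"oauth2": _validate_oauth, "saml": _validate_saml}

def broker_identity(scheme: str, credential: str) -> Principal:
    """Accept any supported credential format, return one normalized Principal."""
    try:
        return VALIDATORS[scheme](credential)
    except KeyError:
        raise ValueError(f"unsupported credential scheme: {scheme}")

print(broker_identity("oauth2", "opaque-token-from-caller"))
```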

You will need identity-based governance policies in place, and these policies need to be enforced globally, not just locally. Effectively, this means the results must be predictable and reproducible regardless of where you deploy your policies. Once the user is identified and authenticated, you can authorize the user based not only on that credential, but also on the location the invocation came from, the time of day, the day of the week, and so on. Furthermore, for highly sensitive systems, the data and the users can be classified as well; top-secret data can then be accessed only by top-classified credentials. To build very effective policies and govern them at run time, you need to integrate with a mature policy decision engine, either standards-based, such as XACML, or integrated with an existing legacy system provider.
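Here is a hedged, minimal sketch of such a decision in Python. The policy table, attribute names, and business-hours rule are all illustrative assumptions, not XACML; a production system would delegate this to a real policy decision engine.

```python
# Illustrative policy decision: the verdict combines the caller's
# credential (role) with where and when the call was made. Default deny.
from datetime import datetime, timezone
from typing import Optional

POLICY = {
    "financials:read": {
        "allowed_roles": {"auditor"},
        "allowed_networks": {"intranet"},  # e.g., deny mobile callers
        "business_hours": (8, 18),         # UTC hours, start inclusive
    }
}

def authorize(principal: dict, action: str, network: str,
              now: Optional[datetime] = None) -> bool:
    rule = POLICY.get(action)
    if rule is None:
        return False                       # unknown action: deny
    now = now or datetime.now(timezone.utc)
    start, end = rule["business_hours"]
    return (principal.get("role") in rule["allowed_roles"]
            and network in rule["allowed_networks"]
            and start <= now.hour < end)

# An auditor calling from the intranet during business hours is allowed:
print(authorize({"role": "auditor"}, "financials:read", "intranet"))
```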

Protect Data

Protect your data as if your business depends on it, because it often does, or should. Make sure that sensitive data, whether in transit or at rest (in storage), is never held in its unprotected original form. While there are multiple ways data can be protected, the most common are encryption and tokenization.

With encryption, the data is encrypted so that only authorized systems can decrypt it back to its original form. This allows the data to circulate encrypted and be decrypted as necessary along the way by secured steps. While this is a good solution for many companies, you need to be careful about the encryption standard you choose and about your key management and key rotation policies.

The other approach, tokenization, is based on the fact that you can't steal what is not there. You can tokenize virtually anything: PCI, PII, or PHI information. The original data is stored in a secure vault, and a token (a pointer representing the data) is sent downstream in its place. The advantage is that if an unauthorized party gets hold of the token, they don't know where to go for the original data, let alone have access to it; even if they do know where the token vault is located, they are not whitelisted, so the original data remains unavailable to them. The greatest advantage of tokenization is that it reduces the exposure scope throughout your enterprise: by eliminating sensitive and critical data from the stream, you eliminate vulnerabilities throughout the system and concentrate your security on a stationary token vault rather than on active, dynamic data streams.

While you're at it, consider a mechanism such as DLP, which is highly effective at monitoring for sensitive data leakage and can automatically tokenize or encrypt sensitive data on its way out. You might also consider policy-based information traffic control: while certain groups may be allowed to view certain information (company financials by an auditor, for example), those groups may not be allowed to send that information onward. You can also enforce this based on where the invocation comes from (e.g., intranet users versus mobile users who are allowed to see certain information).
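To make the token idea concrete, here is a minimal Python sketch under loud assumptions: the vault is an in-memory dictionary and the whitelist check is a boolean flag, where a real deployment would use a hardened, access-controlled vault service.

```python
# Hedged tokenization sketch: the sensitive value never travels
# downstream, only an opaque token does. A production vault is a
# hardened, audited service, not an in-memory dict.
import secrets

_VAULT: dict[str, str] = {}  # token -> original value (stand-in vault)

def tokenize(sensitive: str) -> str:
    token = "tok_" + secrets.token_urlsafe(16)  # no relation to the input
    _VAULT[token] = sensitive
    return token

def detokenize(token: str, caller_whitelisted: bool) -> str:
    if not caller_whitelisted:
        raise PermissionError("caller is not whitelisted for the vault")
    return _VAULT[token]

card = "4111 1111 1111 1111"
token = tokenize(card)       # safe to log, store, and forward downstream
assert detokenize(token, caller_whitelisted=True) == card
```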

QoS

While APIs exposed in the cloud may let you get away with scaling out for expansion or bursts during peak hours, it is still a good architectural design principle to limit or rate-control access to your API. This is especially valuable if you are offering an open API exposed to anyone. There are two sides to this: a business side and a technical side. The technical side allows your APIs to be consumed in a controlled way, and the business side lets you negotiate better SLA contracts based on the usage model you have at hand. You also need a flexible throttling mechanism to implement this efficiently, with options such as notify only, throttle the excess traffic, or shape the traffic by holding messages until the next sampling period starts. In addition, there should be a mechanism to monitor and manage traffic over both the long term and the short term, which can be governed by two different policies.
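A minimal fixed-window limiter in Python, illustrating the three responses named above. The class and mode names are hypothetical, and "shaping" here merely reports the hold time until the next sampling period; a real gateway would queue the message instead.

```python
# Sketch of the throttling options: notify only, hard throttle, or
# shape (defer) traffic until the next sampling period starts.
import time

class RateLimiter:
    def __init__(self, limit: int, window_secs: float):
        self.limit, self.window = limit, window_secs
        self.count, self.window_start = 0, time.monotonic()

    def check(self, mode: str = "throttle") -> str:
        now = time.monotonic()
        if now - self.window_start >= self.window:  # new sampling period
            self.count, self.window_start = 0, now
        self.count += 1
        if self.count <= self.limit:
            return "allow"
        if mode == "notify":
            return "allow-but-alert"                # pass it, raise an alert
        if mode == "shape":
            delay = self.window - (now - self.window_start)
            return f"hold for {delay:.2f}s"         # defer to next period
        return "reject"                             # hard throttle

limiter = RateLimiter(limit=100, window_secs=60)
print(limiter.check())  # "allow" for the first 100 calls per minute
```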

Protect your API

Attacks on, or misuse of, your publicly exposed API can be intentional or accidental. Either way, you can't afford for anyone to bring your API down. You need application-aware firewalls that can look into the application-level messages and prevent attacks. Application attacks generally fall into injection attacks (SQL injection, XPath injection, etc.), script attacks, or attacks on the infrastructure itself.
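As a toy illustration only, and in no way a complete defense, the following Python sketch shows the shape of an application-level message check. The two regular expressions are naive stand-ins for the deep message parsing a real API firewall performs.

```python
# Toy application-level inspection: flag messages matching a few
# illustrative injection/script patterns before they reach the API.
import re

SUSPICIOUS = [
    re.compile(r"('|\")\s*or\s+1\s*=\s*1", re.I),  # classic SQL injection probe
    re.compile(r"<\s*script\b", re.I),             # naive script-attack marker
]

def inspect(message: str) -> bool:
    """Return True if the message looks safe enough to forward."""
    return not any(p.search(message) for p in SUSPICIOUS)

assert inspect('{"user": "alice"}')
assert not inspect("name=' OR 1=1 --")
```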

Message Security

You also need to provide both transport-level and message-level security features. While transport security such as SSL/TLS provides some data privacy, you should also have the option to encrypt or sign the message traffic itself, so that it reaches the end systems safely and securely and those systems can authenticate the end user who sent the message.
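Here is a minimal Python sketch of message-level integrity and sender authentication using an HMAC, assuming a shared key distributed out of band; real deployments would more likely use XML Signature, JWS, or asymmetric keys.

```python
# Message-level security sketch: an HMAC signature travels with the
# message, so the receiver can authenticate it even after TLS
# terminates at an intermediary. Key distribution is assumed.
import hashlib, hmac, json

SHARED_KEY = b"demo-key-distributed-out-of-band"  # illustrative only

def sign(message: dict) -> dict:
    body = json.dumps(message, sort_keys=True).encode()
    sig = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return {"body": message, "signature": sig}

def verify(envelope: dict) -> bool:
    body = json.dumps(envelope["body"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, envelope["signature"])

envelope = sign({"order": 42, "user": "alice"})
assert verify(envelope)  # tampering with the body breaks verification
```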

Imagine if you could provide all of the above in one package: take it out of the box, power it up, and with a few configuration steps deliver most of what we have discussed above. More importantly, in a matter of hours you will have hardened your API to enterprise level (or, in some cases, better than that). Intel has such a solution to offer.

Check out our Intel API gateway solution, which offers all of these hardening features in one package, and a whole lot more. Feel free to reach out to me if you have any questions or need additional information.

http://cloudsecurity.intel.com/solutions/cloud-service-brokerage-api-resource-center

Andy Thurai — Chief Architect & CTO, Application Security and Identity Products, Intel

Andy Thurai is Chief Architect and CTO of Application Security and Identity Products with Intel, where he is responsible for architecting SOA, Cloud, Governance, Security, and Identity solutions for their major corporate customers. In this role, he supports Intel/McAfee field sales, technical teams, and customer executives. Prior to this role, he held technology architecture leadership and executive positions with L-1 Identity Solutions, IBM (DataPower), BMC, CSC, and Nortel. His interests and expertise include Cloud, SOA, identity management, security, governance, and SaaS. He holds a degree in Electrical and Electronics Engineering and has more than 20 years of IT experience.

He blogs regularly at www.thurai.net/securityblog on Security, SOA, Identity, Governance, and Cloud topics. You can find him on LinkedIn.
