Windows Server 2012 – New Advanced Features

In this article I would like to share the new features in Windows Server 2012 that particularly grabbed my attention. It is not a full list of the new features, which you can find on Microsoft's official site; it is more of a summary of the more advanced and intriguing ones.

Live migrations

Windows Server 2008 R2 supported live migration, but only if the virtual hard disk stayed in the same location, e.g. on a SAN. Windows Server 2012 adds the ability to move a virtual machine outside a cluster environment to any other Hyper-V host, and you can even move several machines at the same time. All you need is a shared folder accessible from both hosts, and you can then move a running virtual machine's storage to a new location (storage migration). Windows Server 2012 even offers a "Shared Nothing" live migration: migrating a virtual machine from one host to another even when they share no storage at all, with nothing more than a network connection between them.
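
Conceptually, a shared-nothing live migration copies the virtual disk to the destination, then iteratively copies memory while the machine keeps running, and finally switches over once the remaining changed state is small. The Python sketch below is purely illustrative: the class, the dictionaries standing in for hosts, and the function names are all made up, and nothing here calls the real Hyper-V machinery. It only shows the general sequence of phases.

```python
# Illustrative sketch of the phases of a "Shared Nothing" live migration.
# All names here are hypothetical; real migrations are driven by Hyper-V itself.

from dataclasses import dataclass, field

@dataclass
class VirtualMachine:
    name: str
    disk_blocks: dict = field(default_factory=dict)   # block id -> data
    memory_pages: dict = field(default_factory=dict)  # page id -> data
    dirty_pages: set = field(default_factory=set)     # pages changed since last copy

def migrate_shared_nothing(vm: VirtualMachine, source: dict, destination: dict) -> None:
    """Move a running VM from `source` to `destination` with only a network link."""
    # Phase 1: copy the virtual hard disk over the network (storage migration).
    destination["disks"] = dict(vm.disk_blocks)

    # Phase 2: copy memory iteratively; pages dirtied during the copy are re-sent.
    destination["memory"] = dict(vm.memory_pages)
    while vm.dirty_pages:
        for page in list(vm.dirty_pages):
            destination["memory"][page] = vm.memory_pages[page]
            vm.dirty_pages.discard(page)

    # Phase 3: brief switch-over - the VM now runs on the destination host.
    source.pop(vm.name, None)
    destination[vm.name] = vm

# Example usage with toy data:
vm = VirtualMachine("web01", {0: b"disk"}, {0: b"mem"}, {0})
src_host, dst_host = {"web01": None}, {}
migrate_shared_nothing(vm, src_host, dst_host)
print("web01" in dst_host)  # True
```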

Minimum bandwidth

In a typical virtualization infrastructure, multiple virtual machines share the same physical network card. Under heavy load, one virtual machine can monopolize the traffic, leaving insufficient bandwidth for the rest of the machines. In Windows Server 2008 R2 you could set a maximum bandwidth for each virtual machine, so it could never occupy more than its allocation even when it needed to. However, that was inefficient in situations where the other virtual machines didn't actually need the remaining bandwidth. Setting a minimum bandwidth in Windows Server 2012 instead lets you specify how much bandwidth each virtual machine needs in order to function. These constraints are applied only when the bandwidth needs of the virtual machines conflict: if there is free bandwidth, any virtual machine may use it until other virtual machines that are below their minimum need it.
Let's say we have a 1 Gigabit Ethernet card. We specify the minimum bandwidths for Virtual Machine (VM) 1, VM2, and VM3 as 500 Mbps, 300 Mbps, and 200 Mbps respectively (the sum can't exceed the total bandwidth of the card). During a quiet period for VM2 and VM3, VM1 uses 700 Mbps of the available bandwidth while VM2 and VM3 use 100 Mbps each. Then VM2 has a transaction to process and needs all the bandwidth it can get. It first takes the free 100 Mbps, but because it still needs more and is below its 300 Mbps minimum, VM1 (which is above its own minimum) has to give up another 100 Mbps to VM2.
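
To make the arithmetic concrete, here is a minimal Python sketch of this kind of minimum-bandwidth allocation. It is not how Hyper-V's QoS scheduler actually works; the numbers and the allocation rule are simply the ones from the example above: each VM is guaranteed up to its minimum, and spare bandwidth is handed out only while no one below their minimum needs it.

```python
# Hedged illustration only: a simple "minimum bandwidth" allocator that mirrors
# the worked example above. The real Hyper-V QoS scheduler is more sophisticated.

def allocate(capacity_mbps, vms):
    """vms: dict of name -> {"min": guaranteed Mbps, "demand": requested Mbps}."""
    # Step 1: every VM is guaranteed the smaller of its demand and its minimum.
    alloc = {name: min(v["demand"], v["min"]) for name, v in vms.items()}
    spare = capacity_mbps - sum(alloc.values())

    # Step 2: spare bandwidth goes to VMs that still want more, while it lasts.
    for name, v in vms.items():
        if spare <= 0:
            break
        extra = min(v["demand"] - alloc[name], spare)
        alloc[name] += extra
        spare -= extra
    return alloc

vms = {"VM1": {"min": 500, "demand": 700},
       "VM2": {"min": 300, "demand": 100},
       "VM3": {"min": 200, "demand": 100}}
print(allocate(1000, vms))  # {'VM1': 700, 'VM2': 100, 'VM3': 100}

vms["VM2"]["demand"] = 300  # VM2 suddenly needs its full minimum
print(allocate(1000, vms))  # {'VM1': 600, 'VM2': 300, 'VM3': 100}
```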

Network virtualization

Network virtualization allows you to run multiple virtual networks, possibly with overlapping IP address schemes, on top of the same physical network. It is really useful for cloud service providers, but it can be used within a business as well, for example when HR or payroll traffic must be completely separated from the rest of the traffic. It also lets you move virtual machines wherever you need them, regardless of the physical network, even to the cloud. To make this possible, each virtual machine has two addresses for each network adapter. One, the Customer Address (CA), is used for communication with the other virtual machines and hosts in its own virtual network. The other, the Provider Address (PA), is used only for communication on the physical network. Because each client or department gets its own address mapping, the provider knows which traffic belongs to which client or department, and that traffic stays completely isolated from any other traffic on the physical network.
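
The core idea is a per-tenant mapping between customer addresses and provider addresses. The sketch below is a simplified, hypothetical illustration of such a lookup table, not the actual Hyper-V/NVGRE implementation: two tenants reuse the same customer IP address, yet the physical network only ever sees distinct provider addresses.

```python
# Simplified, hypothetical illustration of network virtualization address mapping.
# Two tenants reuse the same customer IP space; the physical network routes only
# on provider addresses, so their traffic never mixes.

# (tenant, customer_address) -> provider_address
mapping = {
    ("HR",      "10.0.0.5"): "192.168.1.10",
    ("Payroll", "10.0.0.5"): "192.168.1.20",  # same CA, different tenant and PA
}

def encapsulate(tenant, customer_src, customer_dst, payload):
    """Wrap a tenant packet in a provider-addressed envelope for the physical net."""
    provider_src = mapping[(tenant, customer_src)]
    return {"provider_src": provider_src,
            "tenant": tenant,
            "inner": {"src": customer_src, "dst": customer_dst, "data": payload}}

pkt_hr = encapsulate("HR", "10.0.0.5", "10.0.0.6", "hello")
pkt_pay = encapsulate("Payroll", "10.0.0.5", "10.0.0.6", "hello")
print(pkt_hr["provider_src"], pkt_pay["provider_src"])  # 192.168.1.10 192.168.1.20
```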

Resource metering

Resource metering makes capacity planning easier by collecting information about a virtual machine's resource usage over a period of time. Windows Server 2012 also introduces the concept of resource pools, which group multiple virtual machines belonging to one specific client or serving one specific function, so that the metrics are collected on a per-client or per-function basis. This is helpful both for IT budgeting and for billing customers. The metrics typically collected are:

- Average CPU use (over a selected period of time)
- Average memory use
- Minimum memory use
- Maximum memory use
- Maximum disk allocation
- Incoming network traffic
- Outgoing network traffic
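
As a rough illustration of the per-pool idea, here is a small Python sketch that rolls hypothetical per-VM samples up into a few of the pool-level metrics listed above. The sample data and field names are invented; in practice the real numbers come from Hyper-V's resource metering (e.g. the Measure-VM PowerShell cmdlet).

```python
# Hypothetical per-VM samples aggregated into pool-level metrics for billing.
# Values and field names are made up purely to illustrate the grouping idea.

samples = [
    # vm,      pool,      avg_cpu_mhz, avg_mem_mb, max_disk_gb, net_in_mb, net_out_mb
    ("web01",  "ClientA", 1200, 2048, 120, 500, 900),
    ("sql01",  "ClientA", 2500, 8192, 400, 300, 150),
    ("build1", "ClientB",  800, 4096, 200,  50,  40),
]

pools = {}
for vm, pool, cpu, mem, disk, net_in, net_out in samples:
    p = pools.setdefault(pool, {"avg_cpu_mhz": 0, "avg_mem_mb": 0,
                                "max_disk_gb": 0, "net_in_mb": 0, "net_out_mb": 0})
    p["avg_cpu_mhz"] += cpu          # summed here; a real report might average
    p["avg_mem_mb"] += mem
    p["max_disk_gb"] += disk
    p["net_in_mb"] += net_in
    p["net_out_mb"] += net_out

for pool, metrics in pools.items():
    print(pool, metrics)             # one billing line per client/function
```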

Dynamic Host Configuration Protocol (DHCP)

A rogue DHCP server is a fake server connected to the network that listens for DHCP requests and responds with incorrect addressing information. Active Directory protects the Windows DHCP service by not allowing Windows DHCP servers to operate on the network until they are authorized. However, this does not apply to non-Microsoft-Windows DHCP servers, which can still connect to the network and hand out addresses. Windows Server 2012 limits this by allowing you to specify which ports may have DHCP servers attached; if an intruder is attached to any other port, its fake DHCP server packets are simply dropped.
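
Here is a hedged sketch of the filtering idea: DHCP server replies are accepted only if they arrive on a port explicitly allowed to host a DHCP server. The port names and the packet model below are invented for illustration; in Windows Server 2012 this kind of protection is configured on the (virtual) switch port rather than written by hand.

```python
# Illustrative only: drop DHCP server packets arriving on ports not explicitly
# allowed to host a DHCP server. Port names and packet fields are hypothetical.

ALLOWED_DHCP_PORTS = {"port-03"}          # only the legitimate DHCP server sits here

def filter_packet(packet):
    """Return True if the packet may be forwarded, False if it must be dropped."""
    is_dhcp_server_reply = packet["type"] in ("DHCPOFFER", "DHCPACK")
    if is_dhcp_server_reply and packet["ingress_port"] not in ALLOWED_DHCP_PORTS:
        return False                      # rogue server: drop silently
    return True

packets = [
    {"type": "DHCPDISCOVER", "ingress_port": "port-07"},  # client request: fine
    {"type": "DHCPOFFER",    "ingress_port": "port-03"},  # legitimate server
    {"type": "DHCPOFFER",    "ingress_port": "port-11"},  # rogue server: dropped
]
for pkt in packets:
    verdict = "forwarded" if filter_packet(pkt) else "dropped"
    print(pkt["ingress_port"], pkt["type"], verdict)
```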

Snapshots

Snapshots are mainly used when you need point-in-time recovery in case of an error. For example, when you apply a service pack on a production server, you may want to give yourself a fallback in case something goes wrong: you take the snapshot before installing the service pack and, if needed, you recover the server from it. But what if you don't need it? You've monitored the server for a while and everything looks normal after the patch, so the snapshot is no longer needed. In Windows Server 2008 R2 you couldn't just get rid of it; you would have to take the virtual machine offline for a while, making it inaccessible, so that the snapshot could be merged back into the parent disk. Windows Server 2012 adds a new feature, Hyper-V Live Merge, which lets you release the snapshot while the machine continues to run.
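
Conceptually, a snapshot is a differencing layer on top of the base virtual disk, and "merging" means folding the changed blocks back into the parent. The toy Python sketch below shows only that idea; it has nothing to do with the real AVHD/VHDX format. The point of Live Merge is that this folding can now happen while the virtual machine keeps writing.

```python
# Toy model of merging a snapshot (differencing layer) back into its parent disk.
# Nothing here resembles the real AVHD/VHDX on-disk format; it only shows the idea.

parent_disk = {0: "old-boot", 1: "old-data", 2: "old-logs"}   # block id -> contents
snapshot_diff = {1: "patched-data", 3: "new-file"}            # blocks changed since snapshot

def live_merge(parent, diff, writes_during_merge):
    """Fold the diff into the parent while new writes keep arriving (Live Merge idea)."""
    for block, data in diff.items():
        parent[block] = data              # copy changed blocks down into the parent
    for block, data in writes_during_merge:
        parent[block] = data              # the VM never had to stop writing
    return parent

merged = live_merge(parent_disk, snapshot_diff, writes_during_merge=[(2, "fresh-logs")])
print(merged)  # {0: 'old-boot', 1: 'patched-data', 2: 'fresh-logs', 3: 'new-file'}
```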

Stay tuned to Monitis for our future articles on Windows Server 2012. We will take a deeper look into these and some more advanced new features.
