
Windows Server 2012 – New Advanced Features

In this article I would like to share the new features in Windows Server 2012 that particularly grabbed my attention. It's not a full list of the new features, which you can find on Microsoft's official site. It's more a summary of the most advanced and intriguing ones.

Live migrations

Windows Server 2008 R2 supported live migration, but only if the virtual hard disk stayed in the same location, i.e. on a SAN. What Windows Server 2012 brings to the scene is the ability to move a virtual machine outside a cluster environment to any other Hyper-V host, and you can even move several machines at the same time. All you need is a shared folder accessible from both locations, and you can then move the storage of a running virtual machine (storage migration) to a new location. Windows Server 2012 even offers "Shared Nothing" live migration: the ability to migrate a virtual machine from one host to another even when they share no storage at all; a network connection between the two hosts is all that's required.
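To give a feel for why a running machine can move without downtime, here is a minimal sketch (not Hyper-V's actual implementation; all names are hypothetical) of the iterative pre-copy idea behind live migration: copy all memory pages, then keep re-copying the pages the still-running VM dirtied, until the remaining set is small enough to send during a brief final pause.

```python
def live_migrate(pages, dirtied_per_round, max_rounds=10, stop_threshold=8):
    """pages: set of all memory page ids of the VM.
    dirtied_per_round: per copy round, the set of page ids the running VM
    wrote to while that round's transfer was in progress."""
    to_copy = set(pages)              # round 0 copies every page
    transferred_rounds = []
    for round_no in range(max_rounds):
        transferred_rounds.append(len(to_copy))
        dirtied = dirtied_per_round[round_no] if round_no < len(dirtied_per_round) else set()
        to_copy = set(dirtied)        # next round re-copies only dirtied pages
        if len(to_copy) <= stop_threshold:
            break
    # final phase: pause the VM briefly, send the last pages, resume on target
    final_pause_pages = len(to_copy)
    return transferred_rounds, final_pause_pages
```

Each round shrinks the working set, so the final pause is short enough that clients never notice the switch.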

Minimum bandwidth

In a typical virtualization infrastructure, multiple virtual machines share the same physical network card. Under heavy load, one virtual machine can monopolize the traffic, leaving insufficient bandwidth for the rest. In Windows Server 2008 R2 you could set a maximum bandwidth for each virtual machine, so that it could never occupy more than its allocated share, even when it needed to. However, this was inefficient when the other virtual machines didn't actually need the remaining bandwidth. Setting a minimum bandwidth in Windows Server 2012 lets you specify how much bandwidth each virtual machine needs in order to function. These constraints are applied only when the bandwidth needs of virtual machines conflict: if there is free bandwidth, any virtual machine may use it until another virtual machine that is under its minimum needs it.
Let's say we have a 1 Gigabit Ethernet card. We specify the minimum bandwidths for Virtual Machine (VM) 1, VM2, and VM3 to be 500 Mbps, 300 Mbps, and 200 Mbps respectively (the sum can't exceed the total bandwidth of the Ethernet card). In a moment of low activity from VM2 and VM3, VM1 uses 700 Mbps of the available bandwidth while VM2 and VM3 use 100 Mbps each. Then VM2 starts processing a transaction and needs all of its available bandwidth. When that happens, VM2 first occupies the free 100 Mbps, but because it still needs more and is under its minimum of 300 Mbps, VM1 (which exceeds its own minimum) has to give VM2 another 100 Mbps.

Network virtualization

What it allows you to do is have multiple virtual networks, possibly with identical IP address schemes, on top of the same physical network. This is really useful for cloud service providers, but it can be used inside a business as well, for example when HR or Payroll traffic must be completely separated from the rest of the traffic. It also lets you move virtual machines wherever you need them, regardless of the physical network layout, even to the cloud. To make this possible, each virtual machine has two different addresses for each network adapter. One, the Customer Address (CA), is used for communication with the other virtual machines and hosts in its virtual network. The other, the Provider Address (PA), is used only for communication on the physical network. Because each client or department gets its own address mappings, the provider knows which traffic belongs to which client, and that traffic is completely isolated from any other traffic on the physical network.
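Conceptually, the host keeps a mapping table keyed by tenant and Customer Address. The sketch below (a toy model, with made-up addresses and names) shows how two tenants can reuse the exact same CA without colliding, because the lookup always includes the tenant:

```python
# (tenant_id, customer_address) -> provider_address of the host running that VM
mapping = {
    ("TenantA", "10.0.0.5"): "192.168.1.10",
    ("TenantB", "10.0.0.5"): "192.168.1.20",  # same CA, different tenant: no clash
}

def resolve(tenant_id, customer_address):
    """Find the physical (provider) address to which traffic for this
    tenant's virtual address should be delivered."""
    return mapping[(tenant_id, customer_address)]
```

When a VM moves to another host, only its mapping entry changes; the VM keeps its CA, which is why workloads can be relocated without re-addressing them.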

Resource metering

It allows you to do your capacity planning easily, because it collects resource-usage information for a virtual machine over a period of time. Furthermore, Windows Server 2012 introduces the concept of resource pools, which group multiple virtual machines belonging to one specific client or serving one specific function, so that the metrics are collected on a per-client or per-function basis. This is helpful for IT budgeting and for billing customers. The metrics typically collected are: average CPU use (over the selected period), average memory use, minimum memory use, maximum memory use, maximum disk allocation, incoming network traffic, and outgoing network traffic.
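To make the metric list concrete, here is a minimal sketch of rolling periodic per-VM samples up into that kind of report. The data model and field names are hypothetical, not the actual metering API:

```python
def summarize(samples):
    """samples: list of dicts with 'cpu_mhz', 'memory_mb', 'net_in_mb' and
    'net_out_mb', taken at regular intervals over the metering period."""
    n = len(samples)
    return {
        "avg_cpu_mhz":   sum(s["cpu_mhz"] for s in samples) / n,
        "avg_memory_mb": sum(s["memory_mb"] for s in samples) / n,
        "min_memory_mb": min(s["memory_mb"] for s in samples),
        "max_memory_mb": max(s["memory_mb"] for s in samples),
        "net_in_mb":     sum(s["net_in_mb"] for s in samples),   # traffic accumulates
        "net_out_mb":    sum(s["net_out_mb"] for s in samples),
    }
```

Note the difference in aggregation: CPU and memory are averaged (or min/max'd) over the period, while network traffic is a running total, which is exactly what a billing report needs.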

Dynamic Host Configuration Protocol (DHCP)

A rogue DHCP server is a fake server connected to the network that collects DHCP requests and responds with incorrect addressing information. Active Directory protects its DHCP service by not allowing other Windows DHCP servers to operate on the network until they are authorized. However, this does not apply to non-Windows DHCP servers, which can still connect to the network and hand out addresses. Windows Server 2012 limits this by letting you specify which ports may have DHCP servers attached; if an intruder is attached to any other port, its fake DHCP server packets are dropped.
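The filtering rule itself is simple, and a toy model makes it clear (the port names and function are invented for illustration; this is the idea, not the product's implementation): server-side DHCP messages are forwarded only from ports the administrator has marked as trusted, while client messages pass through from anywhere.

```python
TRUSTED_PORTS = {"port-1"}  # ports where legitimate DHCP servers are attached

def filter_dhcp(port, message_type):
    """Drop DHCP *server* replies arriving on untrusted ports;
    everything else (including client requests) is forwarded."""
    server_messages = {"DHCPOFFER", "DHCPACK", "DHCPNAK"}
    if message_type in server_messages and port not in TRUSTED_PORTS:
        return "drop"   # a rogue server answering from the wrong port
    return "forward"
```

A rogue box plugged into an untrusted port can still send DHCPDISCOVER like any client, but its DHCPOFFER replies never reach the victims.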

Snapshots

They are mainly used when you need point-in-time recovery in case of an error. For example, when you apply a service pack on a production server, you may want to give yourself a way back in case something goes wrong: you take the snapshot before the service pack installation and, if needed, restore the server from it. But what happens if you don't need it, because you've monitored the server for a while and everything seems normal after the patch? You don't need the snapshot anymore. In Windows Server 2008 R2, however, you couldn't just get rid of it: the snapshot files were only merged back while the virtual machine was not running, making it inaccessible for a while. Windows Server 2012 has a new feature, called Hyper-V Live Merge, which allows you to release the snapshot while the machine continues to run.
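Under the hood, a snapshot works like a differencing disk: blocks written after the snapshot land in a child file, and deleting the snapshot means folding those blocks back into the parent. A minimal model of that merge (hypothetical names; real snapshots operate on AVHD files, not Python dicts):

```python
def merge(parent_blocks, child_blocks):
    """Both arguments map block number -> data. Blocks present in the
    child (written after the snapshot) override the parent's, exactly
    as a differencing disk overrides its base."""
    merged = dict(parent_blocks)
    merged.update(child_blocks)   # newer writes win
    return merged
```

The merge only ever copies data in one direction, which is why Live Merge can perform it in the background while the virtual machine keeps running against the child disk.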

Stay tuned to Monitis for our future articles on Windows Server 2012. We will take a deeper look into these and some more advanced new features.



More Stories By Hovhannes Avoyan

Hovhannes Avoyan is the CEO of PicsArt, Inc.
