Microsoft announced a number of cool features at TechEd 2014 in Barcelona, and I needed some time to work through all the new material. The Data Protection and High Availability tracks captivated me in particular. There are major and minor changes to Hyper-V, Data Protection Manager, and Azure that make admin life easier and HA/DP better.
Here's a series of summaries of the changes and new features from across the different sessions:
1. Virtualized Backups
They're finally in. Rumors came up this summer, and most people guessed that UR3 for DPM 2012 R2 would add a dedup feature to DPM.
As we now know, it was not part of the original UR3, but it is fully supported since the November 2014 update.
The reason dedup was such a painfully missed feature is the huge amount of storage DPM requires compared with other vendors' solutions.
Almost everyone I talked to about DPM agreed: "We need dedup!"
A Channel 9 session with Dik van Brummen points this out nicely.
2. A New Reporting Framework
Another missing piece is accurate and reliable reporting.
At the moment DPM doesn't offer good reports, especially for environments with more than one DPM server; the reporting engine only covers a single server, for example.
I'm sure you're desperate to get something better.
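On the storage side, the supported setup boils down to enabling Windows Server deduplication on the volume that holds the virtualized DPM server's backup VHDX files. A minimal sketch (the drive letter is a placeholder, and the HyperV usage type assumes a 2012 R2 host at the November 2014 update level):

```powershell
# Sketch: enable deduplication on the volume that stores the
# virtualized DPM server's backup VHDX files ("E:" is a placeholder).
Enable-DedupVolume -Volume "E:" -UsageType HyperV

# Check the configured settings and the current savings.
Get-DedupVolume -Volume "E:"
Get-DedupStatus -Volume "E:"
```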
If you're a SQL Server person this may not come as a surprise, but to me it did.
I've been using gMSAs frequently since they were introduced with Server 2012, almost exclusively for SQL Server services.
I skipped the older Managed Service Accounts (MSAs, or standalone/sMSAs as they have been known since last year) and never used them when they came along with Server 2008 R2.
But a couple of days ago I came across a blog entry, written in February 2014, in which a member of the SQL team said that using gMSAs for SQL Server is actually not supported. Huh.
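For context, creating and installing a gMSA on Server 2012 typically looks something like the following sketch (the account name, domain, and host group are placeholders; the KDS root key only needs to be created once per forest):

```powershell
# Hypothetical example: set up a group managed service account (gMSA).
# Names (svcSQL, contoso.com, SQLHosts) are placeholders.

# One-time prerequisite per forest: create the KDS root key.
Add-KdsRootKey -EffectiveImmediately

# Create the gMSA in Active Directory and allow the SQLHosts
# group to retrieve its managed password.
New-ADServiceAccount -Name svcSQL -DNSHostName svcSQL.contoso.com `
    -PrincipalsAllowedToRetrieveManagedPassword SQLHosts

# On the member server that should run the service:
Install-ADServiceAccount -Identity svcSQL

# Verify the account is usable on this host.
Test-ADServiceAccount -Identity svcSQL
```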
Sometimes I schedule replication tasks for later, either because I'm still working on the servers and systems or because of performance considerations.
Surprisingly often I finish earlier than expected and would like to start the postponed tasks as soon as possible.
But I couldn't find any DPM-specific tasks in the Task Scheduler, and starting the job the usual way via the GUI wasn't helpful. Continue reading →
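One workaround is to trigger the job from the DPM Management Shell instead of hunting for a scheduled task. A sketch, assuming DPM 2012's cmdlets (the server, protection group, and datasource names are placeholders):

```powershell
# Hypothetical sketch: run a postponed DPM job right away from the
# DPM Management Shell. "DPM01", "FileServers", and "D:\" are placeholders.
$pg = Get-ProtectionGroup -DPMServerName "DPM01" |
      Where-Object { $_.FriendlyName -eq "FileServers" }
$ds = Get-Datasource -ProtectionGroup $pg |
      Where-Object { $_.Name -eq "D:\" }

# Create an express full recovery point on disk immediately,
# instead of waiting for the configured schedule.
New-RecoveryPoint -Datasource $ds -Disk -BackupType ExpressFull
```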
"For the device-specific module (DSM) named Microsoft DSM, versions do not match between node CL-N01.contoso.com and node CL-N02.contoso.com." The cluster validation of a couple of Server 2012 R2 machines failed with this message, in VMM and in the Failover Cluster Manager alike. Continue reading →
OK, OK… I already know about the basic problem that only a handful of companies have direct access to the TPM module and its secret keys.
At the 30C3 in Hamburg I saw Prof. Dr. Rüdiger Weis's talk "Kryptographie nach Snowden" (cryptography after Snowden), which gave me good reason to think about how safe it is to stay on Windows systems.
To be clear… I mistrust every company doing security work for money.
It doesn't matter to me how big or small a company is.
What we definitely know after Edward Snowden's leaks is that everything is possible, and that security agencies all over the world are trying to get in.
But there is a problem… Windows is still very strong on desktops and notebooks, so whatever we do as "more professional users", we have to stay close and give Microsoft a reason to do its best on security for everyone who works with Windows.
So hey… if you ever have the chance to speak with Microsoft employees, MVPs, or whoever: state your position that you want the most secure system you can get.
As for me, I'm starting to rethink using a Linux system alongside my Windows and OS X machines.
Deduplication is one of the coolest new features in Server 2012, and I can't wait to get the new R2 with all its improvements.
All the (file) servers I run are brand-new setups, except for one older system that was on 2008 R2 and was upgraded in place a couple of months ago without worries.
There are many image files, drivers, and backups lying around, but the dedup rate was very low right from the beginning.
Usually I see dedup rates of around 40–50% on most file servers, sometimes much more, especially where sysprepped images and other Microsoft-related material are stored.
This one particular file server was running out of space for the second time, and it seems there is a problem with the garbage collection.
I started a thread in the TechNet forum and got some useful information.
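To see what's going on, the dedup cmdlets that ship with Server 2012 let you check the savings and kick off garbage collection by hand instead of waiting for the weekly schedule. A sketch (the drive letter is a placeholder):

```powershell
# Sketch: inspect dedup results and force garbage collection
# on a suspect volume ("D:" is a placeholder).

# Show savings and the deduplication rate for enabled volumes.
Get-DedupStatus | Format-List Volume, SavedSpace, SavingsRate

# Start a garbage-collection job manually and watch its progress.
Start-DedupJob -Volume "D:" -Type GarbageCollection -Priority High
Get-DedupJob
```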
As I mentioned before, I was using Windows Intune and wasn't very happy with Wave D.
But I decided to try it again on a more production-like system.
Maybe I'll get a budget to try it beyond the typical 30-day trial period, but we'll see.
Before I start enrolling anything on a (third-tier) production machine, though, I want to know how to wipe it if I run into trouble or have done enough testing.
The first option is to retire the device, which means Intune creates a scheduled task, uninstalls all the deployed software, and resets all changes made while Intune was in use.
But if there is a deeper problem, or you can't reach the client anymore, you can use a script made by the Intune team.
Here's the official TechNet website.
I tried it a couple of weeks ago on several machines, and it did the job well enough.
Scroll down to the section "Using the Windows Intune uninstall scripts or the Windows Intune command line tool".