The update KB3000850 needed for the support mentioned in the TechEd Barcelona 2014 session about VDI has arrived.
Windows Server Data Deduplication at Scale: Dedup Updates for Large-Scale VDI and Backup Scenarios
It’s included in the November 2014 update rollup and weighs in at roughly 700 MB.
You can find it via Windows Update or here.
Deduplication of virtualized backups was announced as a newly supported scenario at TechEd 2014, as part of a session on VDI and Deduplication.
Actually it’s nothing new; it basically worked already, and I’ve been doing this since the early preview days of 2012 R2.
But now it’s officially supported by Microsoft.
The basic idea is to place all backup VHDX files created by the DPM server on a Scale-Out File Server.
The SOFS can then do exactly what it does in a VDI environment and dedup all the drives.
I can confirm dedup rates above 50% and up to 80% in real-world scenarios.
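Setting this up is a one-liner once the update is installed. A minimal sketch, assuming the backup volume is E: (the drive letter is my assumption); KB3000850 adds the new Backup usage type to Enable-DedupVolume:

```powershell
# Enable deduplication on the backup volume with the new Backup usage type
Import-Module Deduplication
Enable-DedupVolume -Volume "E:" -UsageType Backup

# Verify the setting
Get-DedupVolume -Volume "E:" | Format-List Volume, UsageType, MinimumFileAgeDays
```

The Backup usage type tunes the dedup policies for in-use, constantly changing backup VHDX files, similar to what the HyperV type does for running VDI disks.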
It’s been a while since the dedup feature came to us with Server 2012.
I started playing around with it about two years ago, and since then it’s been one of my favourite features.
For testing purposes I have deduped nearly everything I could, and I still use it in “unsupported” scenarios like non-VDI environments.
So I’ve got quite a good overview of how rock solid this feature really is.
Server 2012 R2’s fantabulous dedup feature is one of my favourites.
But I had a mysterious error in the logs about “Not enough storage”, which was definitely not true.
In the end the error message is simply misleading: the server actually ran out of memory at night when it tried to dedup its storage, due to a rare combination of concurrent VMs and Dynamic Memory. So it didn’t happen often, and the errors escaped my notice.
PowerShell gave me this:
LastOptimizationTime : 24.06.2014 01:26:49
LastOptimizationResult : 0x8007000E
LastOptimizationResultMessage : Not enough storage is available to complete this operation.
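These fields match the output of Get-DedupStatus; assuming the deduped volume is E: (my assumption), the last job result can be checked like this:

```powershell
# Show when the last optimization ran and whether it succeeded
Get-DedupStatus -Volume "E:" |
    Format-List LastOptimizationTime, LastOptimizationResult, LastOptimizationResultMessage
```

A LastOptimizationResult of 0x0 means the job completed successfully; 0x8007000E is the out-of-memory HRESULT behind the misleading “Not enough storage” message.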
Raising the VM’s minimum memory and scaling up its memory weight solved the problem.
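A sketch of both knobs, assuming the file server is a VM named “FileServer01” with its deduped volume on E: (both names are my assumptions):

```powershell
# On the Hyper-V host: raise the VM's minimum memory and its
# Dynamic Memory priority (weight) so it isn't starved at night
Set-VMMemory -VMName "FileServer01" -MinimumBytes 4GB -Priority 80

# Inside the guest: alternatively cap how much memory the dedup
# job may consume (value is a percentage of available RAM)
Start-DedupJob -Volume "E:" -Type Optimization -Memory 50
```

Capping the job’s memory is the gentler option if other workloads share the box; raising the minimum memory fixes the root cause described above.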
Deduplication is one of the coolest new features in Server 2012, and I can’t wait to get the new R2 with all its improvements.
All (file) servers I’m using are brand-new setups, except for one older system that was 2008 R2 and was upgraded in-place a couple of months ago without any worries.
There are many image files, drivers and backups lying around, but the dedup rate was very low right from the beginning.
Usually I see dedup rates around 40–50% on most file servers, sometimes much more, especially if there are sysprepped images and other Microsoft-related stuff.
This one particular file server was running out of space for the second time, and it seems there is a problem with the garbage collection.
I started a thread in the TechNet forum and got some information.
My interim solution at the moment is a PowerShell command to run the cleanup manually and get space back.
PS C:\> Start-DedupJob -Volume E: -Type GarbageCollection -Full
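To see whether the garbage collection actually reclaimed anything, the job progress and the volume’s savings can be inspected afterwards; a sketch, again assuming volume E::

```powershell
# Show progress of currently running dedup jobs
Get-DedupJob

# After the job finishes, check free space and overall savings
Get-DedupVolume -Volume "E:" | Format-List Volume, FreeSpace, SavedSpace, SavingsRate
```

Comparing FreeSpace before and after the full garbage collection makes it easy to tell whether the manual run is really working around the problem.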
Over the next few days I’ll dig a bit deeper and take a look at the system log, trying to find more useful error messages.