Increasing Deduplication Performance on Server 2012/R2 … a bit

Updated 02.02.2015

It’s been a while since the dedup feature arrived with Server 2012.
I started playing around with it about two years ago, and since then it has been one of my favourite features.
For testing purposes I have deduped nearly everything I could, and I still use it in “unsupported” scenarios such as non-VDI environments.
That gave me quite an overview of how rock solid this feature really is.

Let’s take a look at file servers that deliver storage space via SMB, to DPM servers for example.
The maintenance windows are usually really small, but during the night I have a couple of hours to dedup these file servers.

The common task schedules for deduplication use only between 25 and 50% of memory at most (background vs. throughput optimization).
You can find them here:

Task Scheduler Library / Microsoft / Windows / Deduplication

Note: If you have never set up a dedup task, there is nothing to see there.
You have to schedule tasks via the GUI or PowerShell first.
After that, the entries will be created and you’re free to go.
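For example, you can create such a schedule from PowerShell. This is just a sketch; the schedule name, days, and times are placeholders you should adapt to your own maintenance window:

```powershell
# Show the schedules that already exist (empty output if dedup was never scheduled)
Get-DedupSchedule

# Create a nightly throughput optimization window
# (name, days, start time and duration are example values)
New-DedupSchedule -Name "NightlyOptimization" -Type Optimization `
    -Days Monday,Tuesday,Wednesday,Thursday,Friday `
    -Start 22:00 -DurationHours 6 -Memory 50 -Priority Normal
```

Once a schedule exists, the matching entries show up under the Task Scheduler path above.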

Update 02.02.2015:

In a failover cluster environment (Scale-Out File Server/SOFS or a regular Hyper-V cluster),
the dedup tasks can be found under the failover cluster directory in Task Scheduler.


But quite often I could see, and still can see, in the task history that the jobs terminated frequently.


The exact length of 5 hours shows there is a pattern, and yes:
it is not a fault, it is actually expected behavior.

DeDup GUI:

Task Schedule entry:

Every day at 7 a.m. a process that is still running gets killed.
To increase performance a bit, you can set the amount of memory higher than usual.
The full command for throughput optimization is:

%systemroot%\system32\ddpcli.exe enqueue /opt /scheduled /vol * /priority normal /throttle none /memory 50

The PowerShell equivalent looks like this:

Start-DedupJob -Type Optimization -Full -InputOutputThrottleLevel None -Memory 50 -Priority Normal -Volume *

The /memory switch is a percentage, and I could go up to 90–95% of memory without service interruption.
But you have to check what suits your environment best. Maybe it’s less before file services start dropping or getting too slow.
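To judge what your environment tolerates, it helps to watch a job while it runs; a quick check could look like this:

```powershell
# Show currently running dedup jobs with their progress
Get-DedupJob | Format-Table Type,State,Progress,Volume

# Per-volume savings after the run has finished
Get-DedupStatus | Format-Table Volume,SavedSpace,OptimizedFilesCount
```

If Progress barely moves within your window, the job will likely be killed again at 7 a.m.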

A slightly more forceful PowerShell command is:

Start-DedupJob -Type Optimization -Full -InputOutputThrottleLevel None -Memory 90 -Preempt -Priority High -StopWhenSystemBusy -Volume *

So pumping up the memory amount speeds up the dedup run, and the chance of getting through within the time frame rises.
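If the bigger memory allowance works for you, you can also persist it in the scheduled task instead of starting jobs by hand. This assumes your schedule is the default "BackgroundOptimization" entry; adjust the name to whatever Get-DedupSchedule shows on your system:

```powershell
# Persist the higher memory limit in the existing schedule
# ("BackgroundOptimization" is the default schedule name, yours may differ)
Set-DedupSchedule -Name "BackgroundOptimization" -Memory 90
```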

RAM beats time beats RAM