Memory Requirements for DS-Client During Incremental Backup Using the Delta Algorithm
This article describes the memory requirements for DS-Client when performing incremental backups of files using the Delta algorithm.
Processing files with the Delta algorithm is very resource intensive (CPU and memory), and DS-Client needs enough memory to complete successfully. The memory requirement is about 0.25% of the size of the files being processed at a given time (note that a backup activity can process more than one file at once). For example, if DS-Client is processing a 100 GB file, it will require three unfragmented memory blocks totalling about 0.25 GB (about 250 MB) of memory.
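As a rough illustration of this rule of thumb, the following minimal Python sketch estimates the Delta-processing memory for a set of files. The 0.25% ratio is taken from the figure above; the function name and structure are purely illustrative and are not part of any DS-Client API.

```python
# Minimal sketch: estimate DS-Client Delta-processing memory using the
# ~0.25% rule of thumb described above. The ratio and the function name
# are illustrative assumptions, not part of DS-Client itself.

DELTA_MEMORY_RATIO = 0.0025  # ~0.25% of the data processed at a given time

def estimate_delta_memory_bytes(file_sizes_bytes):
    """Estimate memory needed to Delta-process the given files at once."""
    return sum(file_sizes_bytes) * DELTA_MEMORY_RATIO

GB = 1024 ** 3
# Example from the article: a single 100 GB file -> ~0.25 GB (~250 MB).
needed = estimate_delta_memory_bytes([100 * GB])
print(f"Estimated memory: {needed / GB:.2f} GB")  # ~0.25 GB
```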
Sometimes memory may become fragmented, and DS-Client may not be able to allocate the necessary amount; in such scenarios DS-Client will report out-of-memory exceptions.
Here are a few things you can do if you encounter this situation:
- Add more physical memory.
- On a DS-Client backing up a large amount of data (large files, a large number of files, or both), SQL Server and the DS-Client service use most of that computer's memory. Restarting these services will release memory that is otherwise cached and will reduce memory fragmentation.
64-bit DS-Clients support Master/Delta processing for files between 32 KB and 1 PB (1,024 TB) in size.
IMPORTANT: DS-Client requires a sufficient amount of RAM to process any large file. The raw in-RAM processing overhead for a single such file is about 0.3-1% of the file size for the 64-bit DS-Client.
For example, to protect a 1 TB file, the 64-bit DS-Client will require RAM+swap of 1% x 1 TB = 10 GB for Delta processing (with about 0.3% x 1 TB = 3 GB of RAM as the recommended minimum).
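A similar hedged sketch for the 64-bit sizing rule follows. The 0.3% (minimum RAM) and 1% (RAM+swap) ratios come from the figures above; the function itself is an illustrative assumption, not DS-Client functionality.

```python
# Minimal sketch of the 64-bit DS-Client sizing rule described above.
# The 0.3% (minimum RAM) and 1% (RAM + swap) ratios are taken from this
# article; the function is an illustrative assumption.

MIN_RAM_RATIO = 0.003       # ~0.3% of file size: recommended minimum RAM
RAM_PLUS_SWAP_RATIO = 0.01  # ~1% of file size: required RAM + swap

def estimate_64bit_requirements(file_size_bytes):
    """Return (min_ram_bytes, ram_plus_swap_bytes) for one large file."""
    return (file_size_bytes * MIN_RAM_RATIO,
            file_size_bytes * RAM_PLUS_SWAP_RATIO)

TB = 1024 ** 4
GB = 1024 ** 3
# Example from the article: a 1 TB file -> ~3 GB RAM minimum, ~10 GB RAM+swap.
min_ram, ram_swap = estimate_64bit_requirements(1 * TB)
print(f"Minimum RAM: {min_ram / GB:.1f} GB")   # ~3.1 GB
print(f"RAM + swap:  {ram_swap / GB:.1f} GB")  # ~10.2 GB
```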