Channel: VMware Communities : Discussion List - All Communities

Store duplicate files only once to boost compression rates


Microsoft Office, DVDVideoSoft's Free Studio and other software packages can contain many duplicate files (graphical resources shared between themes, for example), yet their installers turn out much smaller than the expanded/extracted contents. I can achieve the same effect when creating an ISO disc image: there is an option to optimize the file system by auto-detecting duplicate entries and storing each one only once. That might not compress Free Studio back down to the roughly 60 MiB of the original installer, but it could bring it from 1 GiB to around 100 MiB. This particular case may be an extreme example, but I think this optimization has the potential to significantly improve compression rates in many cases.
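The duplicate detection described above can be sketched in a few lines: hash every file's contents and group paths by hash, so each group with more than one member is a set of byte-identical files that single-instance storage would keep only once. This is a minimal illustration, not any particular product's implementation; the function names are my own.

```python
import hashlib
import os

def find_duplicates(root):
    """Group files under `root` by content hash (SHA-256).

    Returns a dict mapping hash -> list of paths; any list with more
    than one entry is a set of byte-identical duplicate files."""
    by_hash = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            h = hashlib.sha256()
            with open(path, "rb") as f:
                # Hash in 1 MiB chunks so large files don't need to
                # fit in memory.
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
            by_hash.setdefault(h.hexdigest(), []).append(path)
    return by_hash

def wasted_bytes(by_hash):
    """Bytes that storing each duplicate set only once would save."""
    total = 0
    for paths in by_hash.values():
        if len(paths) > 1:
            size = os.path.getsize(paths[0])
            total += size * (len(paths) - 1)
    return total
```

In practice an archiver would first group by file size (cheap) and only hash candidates of equal size, but the principle is the same: identical content is detected before compression, so the repository stores one copy plus references.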

 

I don't recall seeing this in any popular compression/archiving/backup software, but it would be useful here too, because the file repository (the .dat file) never changes anyway. Users should also have the option to optimize space utilization for extracted files by using NTFS symbolic/hard links, and perhaps even the choice to have the application's final location/folder compressed by default (as in choosing "compress this folder and its contents" within the OS). Or is this kind of optimization performed automatically by the compression engine when larger block sizes are used?
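The hard-link idea for extracted files could look like the following sketch: given a set of paths already known to be byte-identical, replace all but the first with hard links to it, so the volume stores the data once while every path keeps working. The function name is hypothetical; the one real constraint is that hard links (on NTFS as on POSIX file systems) cannot cross volumes.

```python
import os

def hardlink_duplicates(paths):
    """Replace every file in `paths` (assumed byte-identical) with a
    hard link to the first one, so the data is stored only once.

    All paths must be on the same volume; hard links cannot span
    file systems. Returns the path whose data was kept."""
    keep = paths[0]
    for dup in paths[1:]:
        tmp = dup + ".tmp-link"
        os.link(keep, tmp)    # create the new link first...
        os.replace(tmp, dup)  # ...then atomically swap it into place
    return keep
```

Symbolic links would avoid the same-volume restriction, but hard links are more transparent to applications: deleting one name leaves the others intact, and no application can tell the file from an ordinary one.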

 

I'm asking because, from my point of view, wasting resources is practically never acceptable, and in this case it could make a real difference (size does matter).

