Data compression is the reduction of the number of bits that need to be stored or transmitted, and it is quite important in the web hosting field because information kept on hard drives is often compressed so that it takes up less space. There are many different compression algorithms, and their effectiveness depends on the content. Some of them remove only redundant bits, so no data is lost, while others discard bits deemed unnecessary, which results in lower quality once the data is uncompressed. Compression requires a lot of processing time, so a web hosting server has to be powerful enough to compress and uncompress data on the fly. One example of how binary code can be compressed is by "remembering" that there is a run of five consecutive 1s, for instance, instead of storing all five 1s.
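The "remembering runs of identical bits" idea can be illustrated with a minimal run-length encoding sketch in Python. This is only a simplified example of lossless compression, not the algorithm used on the hosting platform; the function names are hypothetical.

def rle_encode(bits: str) -> list[tuple[str, int]]:
    # Store each bit once together with the length of its run,
    # e.g. "11111" becomes a single entry ('1', 5).
    runs: list[tuple[str, int]] = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            runs[-1] = (bit, runs[-1][1] + 1)
        else:
            runs.append((bit, 1))
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    # Expand each run back to the original bits - nothing is lost.
    return "".join(bit * count for bit, count in runs)

encoded = rle_encode("1111100110")
print(encoded)                                # [('1', 5), ('0', 2), ('1', 2), ('0', 1)]
assert rle_decode(encoded) == "1111100110"    # lossless round trip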

Data Compression in Hosting

The compression algorithm that we use on the cloud hosting platform where your new hosting account will be created is called LZ4, and it is employed by the state-of-the-art ZFS file system that powers the platform. The algorithm outperforms the ones other file systems use because its compression ratio is much higher and it processes data significantly faster. The speed is most noticeable when content is being uncompressed, since this happens more quickly than the data can be read from a hard drive. As a result, LZ4 improves the performance of any site hosted on a server that uses this algorithm. We take advantage of LZ4 in one more way - its speed and compression ratio allow us to generate several daily backup copies of the entire content of all accounts and keep them for thirty days. Not only do the backups take up less space, but their generation also does not slow the servers down, as often happens with other file systems.
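On the platform itself, LZ4 is applied transparently inside the ZFS file system, but the lossless compress/decompress round trip it performs can be reproduced with the lz4 Python bindings. This is a hedged illustration only; the package and sample data below are assumptions, not part of the hosting setup.

# pip install lz4
import lz4.frame

original = b"<html>" + b"<p>repetitive web content</p>" * 1000 + b"</html>"

compressed = lz4.frame.compress(original)      # lossless compression
restored = lz4.frame.decompress(compressed)    # the exact original comes back

assert restored == original
print(f"original:   {len(original)} bytes")
print(f"compressed: {len(compressed)} bytes")

Because decompression is so fast, reading compressed data and expanding it in memory can finish sooner than reading the same data uncompressed from a mechanical drive, which is where the performance gain comes from.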

Data Compression in Semi-dedicated Hosting

The ZFS file system that runs on the cloud platform where your semi-dedicated hosting account will be created uses a powerful compression algorithm called LZ4. It is among the best algorithms out there and certainly the most efficient one when it comes to compressing and uncompressing web content, as its compression ratio is very high and it uncompresses data faster than the same data could be read from a hard drive if it were stored uncompressed. In this way, LZ4 speeds up any website that runs on a platform where the algorithm is used. This high performance requires plenty of CPU processing time, which is provided by the large number of clusters working together as part of our platform. In addition, LZ4 enables us to generate several backup copies of your content every day and keep them for one month, as they take up much less space than typical backups and are generated much faster without putting a load on the servers.
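How much less space the compressed backups take depends on the content, and the ratio can be estimated with a short sketch. Again, the lz4 Python bindings and the sample data are assumptions for illustration; on the platform the measurement happens inside ZFS, not in application code.

import lz4.frame

def compression_ratio(data: bytes) -> float:
    """Return the original size divided by the LZ4-compressed size."""
    return len(data) / len(lz4.frame.compress(data))

# Repetitive, text-heavy web content typically compresses very well.
sample = b"user_id,page,visits\n" + b"1042,/index.html,7\n" * 5000
print(f"ratio: {compression_ratio(sample):.1f}x")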