Data compression is the reduction of the number of bits that have to be stored or transmitted, and it is very important in the web hosting field because information stored on hard drives is usually compressed to take up less space. There are various compression algorithms, and their efficiency depends on the content being compressed. Some of them remove only redundant bits, so that no data is lost (lossless compression), while others discard less important bits, which leads to lower quality once the data is uncompressed (lossy compression). Compression takes a considerable amount of processing time, which means that a web hosting server has to be powerful enough to compress and uncompress data on the fly. One example of how binary code can be compressed is to "remember" that there is a run of five consecutive 1s, for instance, rather than storing all five 1s individually.
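The "remembering runs of identical bits" idea above is known as run-length encoding. A minimal sketch in Python (the function names here are illustrative, not from any particular library):

```python
# Minimal run-length encoding sketch: instead of storing five
# consecutive 1s, store the bit once together with a repeat count.
# This is a lossless scheme -- decoding restores the exact input.

def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Collapse runs of identical bits into (bit, count) pairs."""
    runs: list[tuple[str, int]] = []
    for bit in bits:
        if runs and runs[-1][0] == bit:
            runs[-1] = (bit, runs[-1][1] + 1)
        else:
            runs.append((bit, 1))
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Expand (bit, count) pairs back into the original bit string."""
    return "".join(bit * count for bit, count in runs)

encoded = rle_encode("1111100110000000")
print(encoded)  # [('1', 5), ('0', 2), ('1', 2), ('0', 7)]
print(rle_decode(encoded) == "1111100110000000")  # True
```

Sixteen bits become four (bit, count) pairs, and the round trip is exact, which is what makes the scheme lossless.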

Data Compression in Cloud Web Hosting

The ZFS file system that runs on our cloud web hosting platform uses a compression algorithm called LZ4. The latter is considerably faster than other widely used algorithms, particularly when compressing and uncompressing non-binary data, i.e. web content. LZ4 even uncompresses data quicker than it can be read from a hard disk, which improves the overall performance of sites hosted on ZFS-based platforms. Since the algorithm compresses data well and does so very quickly, we are able to generate several daily backups of all the content kept in the cloud web hosting accounts on our servers. Both your content and its backups will require less space, and since both ZFS and LZ4 work very quickly, the backup generation will not affect the performance of the web servers where your content is stored.
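Why web content compresses so well can be illustrated with a short sketch. Python's standard library does not ship LZ4 (that would require the third-party `lz4` package), so the example below uses stdlib `zlib` purely as a stand-in to show the effect of compressing repetitive text; LZ4 itself trades a little compression ratio for much higher speed:

```python
import zlib

# Illustrative only: HTML and similar web content is highly repetitive,
# so any lossless compressor shrinks it substantially. zlib stands in
# here for LZ4, which is not in the standard library.
html = b"<li>item</li>" * 1000          # repetitive web-style content

compressed = zlib.compress(html)
print(len(html), len(compressed))        # compressed size is far smaller

# Lossless round trip: the original bytes come back exactly.
assert zlib.decompress(compressed) == html
```

The same repetitiveness is what lets a file system like ZFS compress web content transparently without the ratio-versus-speed trade-off becoming visible to the sites it serves.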

Data Compression in Semi-dedicated Hosting

If you host your websites in a semi-dedicated hosting account from our firm, you can experience the advantages of LZ4 - the powerful compression algorithm employed by the ZFS file system behind our advanced cloud Internet hosting platform. What distinguishes LZ4 from other algorithms is that it combines a good compression ratio with much higher speed, particularly when it comes to uncompressing web content. It does that even faster than uncompressed data can be read from a hard drive, so your Internet sites will perform faster. The higher speed comes at the expense of a considerable amount of CPU processing time, but that is not an issue for our platform because it consists of numerous clusters working together. Along with the improved performance, you will have multiple daily backup copies at your disposal, so you can restore any deleted content with a few clicks. The backup copies are kept for an entire month, and we can afford to store them because they take up significantly less space than standard backups.