Data compression is the process of encoding information using fewer bits than the original representation. The compressed data takes up less disk space than the original, so more content can be stored in the same amount of space. There are various compression algorithms that work in different ways: lossless algorithms remove only redundant bits, so when the data is decompressed it is restored exactly, with no loss of quality; lossy algorithms also discard bits deemed unnecessary, so decompressing the data afterwards yields lower quality than the original. Compressing and decompressing content consumes a considerable amount of system resources, CPU time in particular, so any hosting platform that compresses data in real time needs enough processing power to support the feature. A simple example of compression is replacing a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of storing the sequence itself.
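The 111111-to-6x1 idea above is known as run-length encoding. A minimal sketch in Python (the helper names are illustrative, not part of any particular platform):

```python
def rle_encode(bits: str) -> list[tuple[str, int]]:
    """Turn a bit string into (symbol, run_length) pairs."""
    runs: list[tuple[str, int]] = []
    for b in bits:
        if runs and runs[-1][0] == b:
            # Extend the current run of identical symbols.
            runs[-1] = (b, runs[-1][1] + 1)
        else:
            # A different symbol starts a new run.
            runs.append((b, 1))
    return runs

def rle_decode(runs: list[tuple[str, int]]) -> str:
    """Expand (symbol, run_length) pairs back into the original string."""
    return "".join(b * n for b, n in runs)

# "111111" compresses to a single pair, and decoding is lossless:
# rle_encode("111111") -> [("1", 6)]
```

Because decoding reproduces the input exactly, this is a lossless scheme in the sense described above; it only pays off when the data actually contains long runs.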

Data Compression in Cloud Hosting

The ZFS file system that runs on our cloud hosting platform uses a compression algorithm known as LZ4. It can improve the performance of any website hosted in a cloud hosting account with us, because it not only compresses data better than the algorithms used by other file systems, but also decompresses data faster than a hard disk drive can read it. This comes at the cost of significant CPU time, which is not an issue for our platform, as it runs on clusters of powerful servers working together. Another advantage of LZ4 is that it allows us to generate backups much faster and store them in less disk space, so we keep several daily backups of your files and databases, and creating them does not affect server performance. This way, we can always restore any content that you may have deleted by accident.
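For readers who administer ZFS themselves, LZ4 compression is enabled per dataset with the standard zfs tooling; a brief sketch follows (the dataset name tank/web is a placeholder, and hosting customers do not need to run this, as it is handled on the platform side):

```shell
# Enable LZ4 compression for a dataset (dataset name is illustrative)
zfs set compression=lz4 tank/web

# Inspect the setting and the compression ratio ZFS reports for the data
zfs get compression,compressratio tank/web
```

Compression applies to newly written blocks, so existing data keeps its old encoding until it is rewritten.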