Data compression is the process of encoding information using fewer bits than the original representation requires for storage or transmission. Compressed data occupies substantially less disk space than the original, so much more content can be stored in the same amount of space. There are various compression algorithms that work in different ways. With lossless algorithms, only redundant bits are removed, so when the data is uncompressed there is no loss of quality. Lossy algorithms also discard less important bits, so uncompressing the data later yields lower quality than the original. Compressing and uncompressing content consumes a significant amount of system resources, in particular CPU time, so any Internet hosting platform that employs real-time compression must have enough processing power to support that feature. One example of how information can be compressed is to substitute a binary sequence such as 111111 with 6x1, i.e. "remembering" how many consecutive 1s or 0s there are instead of keeping the whole sequence.