Zhongyuan Institute of Artificial Intelligence
Lossless data compression by large models

This paper establishes a new data compression paradigm by demonstrating that "understanding is compression": large generative models, through their deep comprehension of the data, achieve superior compression ratios. The LMCompress framework consistently outperforms traditional compression algorithms, often doubling or quadrupling the compression ratio across images, video, audio, and text by leveraging these models' predictive power.
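
The core mechanism is that an autoregressive model's next-token probabilities determine how few bits are needed to encode the data losslessly, with an entropy coder (typically arithmetic coding) turning those probabilities into an actual bitstream. The sketch below is a minimal illustration of this idea, not the paper's implementation: it uses the public Hugging Face `transformers` package and the small "gpt2" checkpoint (both assumptions for illustration; LMCompress relies on larger, domain-adapted models) and computes the ideal code length a probability-driven coder could achieve.

```python
# Minimal sketch: better prediction means fewer bits.
# Assumes the `transformers` library and the public "gpt2" checkpoint.
import math
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

text = "Understanding is compression: better prediction means fewer bits."
ids = tokenizer(text, return_tensors="pt").input_ids  # shape (1, seq_len)

with torch.no_grad():
    logits = model(ids).logits  # (1, seq_len, vocab_size)

# Ideal code length = sum over tokens of -log2 P(token_t | tokens_<t).
# An arithmetic coder driven by these probabilities approaches this
# length to within a few bits. The first token is left uncoded here.
log_probs = torch.log_softmax(logits[:, :-1, :], dim=-1)
token_log_probs = log_probs.gather(-1, ids[:, 1:].unsqueeze(-1)).squeeze(-1)
bits = -(token_log_probs / math.log(2)).sum().item()

raw_bits = 8 * len(text.encode("utf-8"))
print(f"model code length: {bits:.1f} bits vs raw size: {raw_bits} bits")
```

The better the model predicts the next token, the closer its probabilities are to 1 for the true continuation, and the smaller the summed code length becomes, which is the sense in which stronger "understanding" yields stronger compression.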
