Nacrith: Neural Lossless Compression via Ensemble Context Modeling and High-Precision CDF Coding
1️⃣ One-Sentence Summary
This paper presents Nacrith, an efficient lossless compression system that combines a large language model with several lightweight predictors and a number of novel techniques, achieving better compression ratios than traditional methods on both text and binary files while running quickly on an ordinary consumer GPU.
We present Nacrith, a lossless compression system that combines a 135M-parameter transformer language model (SmolLM2-135M) with an ensemble of lightweight online predictors and a 32-bit arithmetic coder. Beyond the base LLM-plus-arithmetic-coding paradigm, Nacrith introduces several contributions: (1) a CDF precision upgrade from 2^16 to 2^24 that eliminates ~75% of the quantization overhead caused by minimum-probability floors in large vocabularies; (2) a token-level N-gram model for fast local predictions; (3) an adaptive log-space bias head that corrects per-document LLM errors via online gradient descent; (4) confidence-based LLM skipping that accelerates highly predictable tokens; (5) a hybrid binary format (NC06) extending neural compression to arbitrary binary files -- to our knowledge a first among LLM-based compressors; (6) a llama.cpp inference backend achieving ~7x faster single-token decode than PyTorch; (7) parallel multi-GPU compression across up to 8 workers; and (8) a native KV-cache sliding window reducing per-slide cost by ~37x. The system requires only ~500 MB of GGUF weights and ~1.2 GB of VRAM per worker, and runs on consumer GPUs. On alice29.txt (Canterbury Corpus, 152 KB), Nacrith achieves 0.918 bits per byte (bpb), outperforming gzip by 3.1x, bzip2 by 2.5x, CMIX v21 by 44%, and ts_zip by 20%, while compressing below the 0th-, 1st-, and 2nd-order byte-level Shannon entropy bounds. On enwik8 (100 MB), Nacrith achieves 0.9389 bpb (11.74%), surpassing ts_zip (~1.11 bpb) by 15% and FineZip (1.024 bpb) by 8% despite using a 60x smaller model with no fine-tuning. An out-of-distribution evaluation on a document published after the model's training cutoff achieves 0.723 bpb, confirming these gains are not memorization artifacts.
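Contribution (1) addresses the minimum-probability floor: an integer arithmetic coder must assign every vocabulary token at least one count so that any symbol remains codable, and at 16-bit precision a ~49K-token vocabulary consumes roughly 75% of the available counts (49152/65536) before the model's prediction is applied; at 24-bit precision the floor costs well under 1%. The sketch below is a simplification, not the paper's implementation -- the vocabulary size, the toy peaked distribution, and the drift-correction strategy are assumptions -- and it measures the resulting coding overhead as the KL divergence between the true and quantized distributions:

```python
import math

def quantize(probs, precision_bits):
    """Quantize a probability vector to integer counts summing to 2^precision_bits,
    giving every symbol at least 1 count (needed so the arithmetic coder can
    represent any symbol). Rounding drift is dumped onto the argmax symbol,
    a simplification that suffices for this illustration."""
    total = 1 << precision_bits
    freqs = [max(1, int(p * total)) for p in probs]
    drift = total - sum(freqs)
    freqs[max(range(len(freqs)), key=lambda i: probs[i])] += drift
    return [f / total for f in freqs]

def overhead_bits(p, q):
    """Expected extra bits per symbol when coding with q instead of the
    true distribution p (the KL divergence D(p || q))."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

V = 49152  # vocabulary size on the order of SmolLM2's (assumed)
# An LLM-like peaked distribution: one near-certain token, tiny tail mass.
p = [0.99] + [0.01 / (V - 1)] * (V - 1)
for bits in (16, 24):
    print(bits, overhead_bits(p, quantize(p, bits)))
```

At 16 bits the per-token floor swamps the peaked prediction (overhead on the order of bits per symbol), while at 24 bits the same floor costs a negligible fraction of a bit, consistent with the abstract's claim that the precision upgrade removes most of the quantization overhead.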
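Contribution (3), the adaptive log-space bias head, can be pictured as an additive bias on the LLM's logits, updated by online gradient descent on the cross-entropy of each token after it is coded. Because each update depends only on data both sides have already processed, encoder and decoder apply identical updates and stay in sync. A minimal sketch under those assumptions (the class name, learning rate, and 4-token toy vocabulary are illustrative, not the paper's implementation):

```python
import math

def softmax(z):
    m = max(z)
    e = [math.exp(x - m) for x in z]
    s = sum(e)
    return [x / s for x in e]

class BiasHead:
    """Per-document additive bias on LLM logits, trained online.
    Encoder and decoder run the same deterministic update after each
    coded token, so their models never diverge."""
    def __init__(self, vocab_size, lr=0.1):
        self.bias = [0.0] * vocab_size
        self.lr = lr

    def predict(self, logits):
        # Corrected distribution: softmax of logits plus the learned bias.
        return softmax([z + b for z, b in zip(logits, self.bias)])

    def update(self, logits, target):
        # Gradient of cross-entropy w.r.t. the bias is (p - onehot(target)).
        p = self.predict(logits)
        for i in range(len(self.bias)):
            self.bias[i] -= self.lr * (p[i] - (1.0 if i == target else 0.0))

# Suppose the LLM systematically under-predicts token 2 in this document:
head = BiasHead(4)
logits = [2.0, 1.0, 0.0, -1.0]   # static LLM output, for illustration
before = head.predict(logits)[2]
for _ in range(50):
    head.update(logits, 2)       # the document keeps emitting token 2
after = head.predict(logits)[2]
print(before, after)             # probability assigned to token 2 rises
```

As the bias absorbs the document-specific error, the corrected distribution concentrates on the recurring token, so the arithmetic coder spends fewer bits on it; the same mechanism reverses cleanly during decompression.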
Source: arXiv:2602.19626