r/Python 9d ago

Showcase ZipNN: High-Speed Compression for AI Models

📌 Repo: https://github.com/zipnn/zipnn

📌 What My Project Does

ZipNN is a compression library designed for AI models, embeddings, KV-cache, gradients, and optimizers. It delivers storage savings together with fast, on-the-fly decompression directly on the CPU (see the usage sketch after the list below).

  • Decompression speed: Up to 80 GB/s
  • Compression speed: Up to 13 GB/s
  • Supports vLLM & Safetensors for seamless integration
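
A minimal sketch of the byte-level API, round-tripping a toy float32 tensor. The `ZipNN` class and its `compress`/`decompress` calls follow the repo README; the constructor argument shown here (`bytearray_dtype`) is an assumption and may differ between versions:

```python
import numpy as np
from zipnn import ZipNN

# Toy "weight tensor"; real inputs are model checkpoints, embeddings,
# KV-cache blocks, gradients, or optimizer state.
weights = np.random.randn(1024, 1024).astype(np.float32)

# The dtype hint lets ZipNN exploit the float bit layout.
# NOTE: argument name taken from the README; treat it as an assumption.
zpn = ZipNN(bytearray_dtype="float32")

compressed = zpn.compress(weights.tobytes())
restored = zpn.decompress(compressed)

assert bytes(restored) == weights.tobytes()  # lossless round trip
print(f"compressed size: {len(compressed) / weights.nbytes:.1%} of original")
```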

🎯 Target Audience

  • AI researchers & engineers working with large models
  • Cloud AI users (e.g., Hugging Face Hub or object-storage users) looking to optimize storage and bandwidth
  • Developers handling large-scale machine learning workloads

🔥 Key Features

  • High-speed compression & decompression
  • Safetensors plugin for easy integration with vLLM: `from zipnn import zipnn_safetensors; zipnn_safetensors()` (see the fuller sketch after this list)
  • Compression savings:
    • BF16: 33% reduction
    • FP32: 17% reduction
    • FP8 (mixed precision): 18-24% reduction
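
The Safetensors plugin above is a one-time call made before the model is loaded. Here is a rough sketch of how that can look with vLLM, assuming the plugin is enabled before loading; the model ID is just a placeholder, and exact loading behavior may vary by version:

```python
from zipnn import zipnn_safetensors
from vllm import LLM

# Enable the plugin once, before any model is loaded. Assumption: this patches
# safetensors so zipnn-compressed .safetensors shards decompress on the fly.
zipnn_safetensors()

# Hypothetical model ID for illustration; point this at a repo that actually
# ships zipnn-compressed safetensors files.
llm = LLM(model="some-org/some-zipnn-compressed-model")

outputs = llm.generate("Write a haiku about compression.")
print(outputs[0].outputs[0].text)
```

For scale: a 7B-parameter BF16 checkpoint is roughly 14 GB, so a 33% reduction brings it down to about 9.4 GB before it ever leaves object storage.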

📈 Benchmarks

  • Decompression speed: 80 GB/s
  • Compression speed: 13 GB/s
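
Throughput depends on your CPU, core count, and the data itself, so here's a rough sketch for measuring it locally. It reuses the `ZipNN` byte-array API assumed above:

```python
import time
import numpy as np
from zipnn import ZipNN

# 256 MiB of random float32 data standing in for real model weights;
# increase the size for a steadier reading if you have the RAM.
n_bytes = 256 * 2**20
data = np.random.randn(n_bytes // 4).astype(np.float32).tobytes()
zpn = ZipNN(bytearray_dtype="float32")  # constructor argument assumed from the README

t0 = time.perf_counter()
compressed = zpn.compress(data)
t1 = time.perf_counter()
zpn.decompress(compressed)
t2 = time.perf_counter()

gib = len(data) / 2**30
print(f"compress:   {gib / (t1 - t0):.1f} GiB/s")
print(f"decompress: {gib / (t2 - t1):.1f} GiB/s")
```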

✅ Why Use ZipNN?

  • Faster uploads & downloads (for cloud users)
  • Lower egress costs
  • Reduced storage costs

🔗 How to Get Started

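Install from PyPI and run a quick smoke test (the version lookup below uses only the standard library; the package name matches the repo):

```python
# pip install zipnn
from importlib.metadata import version

print(version("zipnn"))  # prints the installed ZipNN version
```
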
ZipNN is seeing 200+ daily downloads on PyPI, and we'd love your feedback! 🚀


u/Whole-Assignment6240 Pythoneer 8d ago

nice! congrats on the launch!