whisper.cpp
by Georgi Gerganov
C/C++ port of Whisper — runs on anything, from a Raspberry Pi to Apple Silicon.
Category
Open source
License
MIT
Stars
★ 48.8k
Last push
2026-04-20
Pricing
Free
Platforms
macOS, Linux, Windows, iOS, Android, Edge
What it is
whisper.cpp is a dependency-free C/C++ port of OpenAI's Whisper. No Python, no PyTorch, no mandatory CUDA — it runs almost anywhere and is among the fastest Whisper options on Apple Silicon thanks to its Metal backend. Its ggml tensor library later became the foundation of the popular llama.cpp. A good fit when you need privacy-preserving, offline transcription on consumer hardware.
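The library is driven through a small C API in `whisper.h`. A minimal sketch of loading a model and transcribing a buffer of 16 kHz mono float PCM might look like this (the model path and the `pcm`/`n_samples` audio buffer are assumptions — you would load them yourself, e.g. from a WAV file):

```c
#include <stdio.h>
#include "whisper.h"

// Hypothetical entry point: assumes `pcm` holds 16 kHz mono f32 samples
// and that models/ggml-base.en.bin has already been downloaded.
int transcribe(const float * pcm, int n_samples) {
    struct whisper_context_params cparams = whisper_context_default_params();
    struct whisper_context * ctx =
        whisper_init_from_file_with_params("models/ggml-base.en.bin", cparams);
    if (ctx == NULL) {
        return 1; // model failed to load
    }

    // Default greedy decoding parameters
    struct whisper_full_params wparams =
        whisper_full_default_params(WHISPER_SAMPLING_GREEDY);

    // Run the full encoder/decoder pipeline on the audio
    if (whisper_full(ctx, wparams, pcm, n_samples) != 0) {
        whisper_free(ctx);
        return 2; // inference failed
    }

    // Print each decoded segment
    const int n_segments = whisper_full_n_segments(ctx);
    for (int i = 0; i < n_segments; i++) {
        printf("%s\n", whisper_full_get_segment_text(ctx, i));
    }

    whisper_free(ctx);
    return 0;
}
```

This is a sketch, not a drop-in program: real usage needs the audio loading and error handling shown in the repository's `examples/` directory.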
Best for: Offline / on-device transcription, Apple Silicon Metal acceleration, low-RAM targets.
Watch out for: No speaker diarization out of the box (requires external tooling such as pyannote); model management is manual.
Install / use
git clone https://github.com/ggerganov/whisper.cpp && cd whisper.cpp && make
sh ./models/download-ggml-model.sh base.en
Features
| Speaker diarization | No |
| Word-level timestamps | Yes |
| Streaming / real-time | Yes |
| Languages supported | 99 |
| HIPAA eligible | No |
Alternatives
Whipscribe is a managed faster-whisper + whisperX service. If you want transcripts without running infrastructure, we're one click away.
Try Whipscribe →