sccache

A shared compilation cache that wraps compilers like rustc, clang, and gcc to cache build artifacts locally or in cloud storage, dramatically speeding up repeated builds.

sccache is a compiler caching tool in the spirit of ccache, but with first-class support for Rust (rustc), C/C++ (gcc, clang, MSVC), and CUDA. It can cache compilation artifacts locally on disk or remotely in cloud storage (S3, GCS, Azure Blob, Redis), making it invaluable for CI pipelines and large multi-crate workspaces where recompilation is the biggest time sink.

Features

  • Rust support — wraps rustc and integrates natively with Cargo via the RUSTC_WRAPPER environment variable
  • C/C++ support — works as a drop-in wrapper for gcc, clang, and MSVC
  • Multiple storage backends — local disk, Amazon S3, Google Cloud Storage, Azure Blob Storage, and Redis
  • Distributed compilation — optional distributed mode where compile jobs are farmed out to a pool of worker machines
  • Cross-platform — runs on Linux, macOS, and Windows
  • Statistics — sccache --show-stats gives a clear picture of cache hit rates and time saved
  • Zero code changes — integrates entirely via environment variables, no changes to your Cargo.toml or build scripts required

Installation

cargo install sccache

Or download a pre-built binary from the GitHub releases page, or install it with your system package manager:

# Debian / Ubuntu
apt install sccache

# Fedora
dnf install sccache

# macOS
brew install sccache

# Arch Linux
pacman -S sccache

# Nix
nix-env -iA nixpkgs.sccache

Setup

Rust / Cargo

Set the RUSTC_WRAPPER environment variable to point at sccache, and Cargo will automatically use it:

export RUSTC_WRAPPER=sccache

Add this to your ~/.bashrc, ~/.zshrc, or equivalent to make it permanent. Alternatively, set it in ~/.cargo/config.toml to apply it to all projects:

[build]
rustc-wrapper = "sccache"
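To confirm that Cargo is actually routing compilations through sccache, one quick sanity check (assuming sccache is on your PATH) is to zero the statistics, run a build, and verify that compile requests were recorded:

```shell
# Zero the counters, build, then confirm sccache saw the compilations
sccache --zero-stats
cargo build
sccache --show-stats   # "Compile requests" should be non-zero
```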

C/C++

Use sccache as a compiler wrapper directly:

CC="sccache gcc" CXX="sccache g++" cmake ..
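With CMake specifically, an alternative to overriding CC/CXX is the compiler-launcher mechanism (CMake 3.4+), which prefixes every compiler invocation without disturbing compiler detection:

```shell
cmake -DCMAKE_C_COMPILER_LAUNCHER=sccache \
      -DCMAKE_CXX_COMPILER_LAUNCHER=sccache ..
```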

Cloud Storage (S3 example)

export SCCACHE_BUCKET=my-sccache-bucket
export SCCACHE_REGION=us-east-1
export AWS_ACCESS_KEY_ID=...
export AWS_SECRET_ACCESS_KEY=...
export RUSTC_WRAPPER=sccache

With cloud storage configured, all CI runners that share the same bucket benefit from one another's cached artifacts — the most impactful configuration for large teams and monorepos.
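For a single machine no bucket is needed: the local disk backend is the default, and its location and size cap can be tuned with environment variables (the values below are illustrative):

```shell
export SCCACHE_DIR="$HOME/.cache/sccache"   # where cached artifacts live
export SCCACHE_CACHE_SIZE="20G"             # disk cache size cap (default 10G)
export RUSTC_WRAPPER=sccache
```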

Usage

# Show cache statistics (hit rate, bytes stored, time saved)
sccache --show-stats

# Zero the statistics counters (does not delete cached artifacts)
sccache --zero-stats

# Stop the sccache server (it runs as a background daemon)
sccache --stop-server

# Show more detailed (advanced) statistics
sccache --show-adv-stats
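A simple way to watch the cache pay off (a sketch, assuming a Rust project with sccache configured as above): build once cold, wipe the local build outputs, and rebuild — the cache survives cargo clean, so the second build should be dominated by hits:

```shell
sccache --zero-stats
cargo build            # cold: compiles everything, populates the cache
cargo clean            # removes target/, but not the sccache cache
cargo build            # warm: compilations are served from the cache
sccache --show-stats   # "Cache hits" should now dominate
```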

Typical CI Configuration

# GitHub Actions example
- name: Install sccache
  uses: mozilla-actions/sccache-action@v0.0.5

- name: Build
  env:
    RUSTC_WRAPPER: sccache
    SCCACHE_GHA_ENABLED: "true"
  run: cargo build --release

The sccache-action handles installation and uses GitHub Actions' own cache storage backend automatically — no S3 bucket required.

Impact

On a cold cache, sccache adds negligible overhead. On a warm cache, a hit rate of even 50–60% can cut build times roughly in half. In CI environments where dependency trees rarely change between runs, hit rates above 80% are common.
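The back-of-the-envelope arithmetic is straightforward: if cache hits cost roughly nothing, warm build time scales with the miss rate. A sketch with hypothetical numbers (ignoring link time and cache-lookup overhead):

```shell
cold=300      # hypothetical cold build time, in seconds
hit_rate=80   # percent of compilations served from the cache
# warm time ~= cold time x miss rate
warm=$(( cold * (100 - hit_rate) / 100 ))
echo "warm build: ~${warm}s"   # prints "warm build: ~60s"
```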