Korthane

Why We Chose Rust for Our CLI Tools

After years of writing CLI tools in Go, we switched to Rust for a new generation of performance-critical utilities. Here is what drove the decision and what we learned.


We have been writing Go professionally for the better part of a decade. It is a fantastic language for backend services — fast compilation, excellent concurrency primitives, and a standard library that covers most of what you need. So when we started building a new suite of CLI tools for data pipeline management, Go was the obvious choice.

Except this time, it wasn’t.

The Problem with GC Pauses

Our pipeline tools process large volumes of data in tight loops. Early prototypes in Go showed occasional latency spikes that traced back to garbage collection. For a long-running HTTP service, a 2ms GC pause is invisible. For a CLI tool that processes millions of records and needs predictable throughput, it matters.

// Zero-copy parsing with Rust's borrow checker
fn parse_record<'a>(input: &'a [u8]) -> Result<Record<'a>> {
    let header = Header::from_bytes(&input[..HEADER_SIZE])?;
    let payload = &input[HEADER_SIZE..HEADER_SIZE + header.len()];
    Ok(Record { header, payload })
}

Rust’s ownership model lets us write zero-copy parsers that borrow directly from the input buffer — no allocations, no defensive copies. The borrow checker verifies at compile time that every reference stays valid for as long as it is used, a guarantee garbage-collected languages can only approximate at runtime.
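To make the pattern concrete, here is a runnable sketch of the snippet above with a toy 4-byte length-prefixed header. The `Header` and `Record` definitions here are illustrative stand-ins, not our production types:

```rust
const HEADER_SIZE: usize = 4;

#[derive(Debug)]
struct Header {
    len: usize,
}

impl Header {
    // Interpret the first 4 bytes as a big-endian payload length.
    fn from_bytes(bytes: &[u8]) -> Result<Header, String> {
        let arr: [u8; 4] = bytes
            .try_into()
            .map_err(|_| "short header".to_string())?;
        Ok(Header { len: u32::from_be_bytes(arr) as usize })
    }
}

#[derive(Debug)]
struct Record<'a> {
    header: Header,
    payload: &'a [u8], // borrows from the input buffer; no copy made
}

fn parse_record(input: &[u8]) -> Result<Record<'_>, String> {
    // `get` instead of indexing keeps the parser panic-free on short input.
    let header_bytes = input
        .get(..HEADER_SIZE)
        .ok_or_else(|| "truncated header".to_string())?;
    let header = Header::from_bytes(header_bytes)?;
    let payload = input
        .get(HEADER_SIZE..HEADER_SIZE + header.len)
        .ok_or_else(|| "truncated payload".to_string())?;
    Ok(Record { header, payload })
}

fn main() {
    // 4-byte big-endian length prefix (5) followed by a 5-byte payload.
    let buf = [0u8, 0, 0, 5, b'h', b'e', b'l', b'l', b'o'];
    let record = parse_record(&buf).unwrap();
    assert_eq!(record.payload, b"hello");
    assert_eq!(record.header.len, 5);
}
```

The lifetime in `Record<'a>` ties the parsed record to the buffer it came from: the compiler rejects any code that drops the buffer while a `Record` still borrows it.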

Error Handling That Scales

Go’s if err != nil pattern works well in small codebases. In a CLI tool with dozens of subcommands and complex error chains, Rust’s Result type and the ? operator produce cleaner code:

fn process_pipeline(config: &Config) -> Result<Stats> {
    let source = Source::connect(&config.source_url)?;
    let sink = Sink::connect(&config.sink_url)?;
    let mut stats = Stats::default();

    for batch in source.batches(config.batch_size) {
        let transformed = transform(batch?)?;
        stats.records += transformed.len();
        sink.write(transformed)?;
    }

    Ok(stats)
}

Every fallible call is explicit, and the compiler refuses to let a Result go silently unhandled.
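The single-parameter `Result<Stats>` in the snippet above implies a crate-local alias over a shared error type. A minimal sketch of how that might be wired up — the variant names and `connect` helper here are illustrative, not from the actual codebase:

```rust
use std::fmt;

// A crate-wide error enum; real code would have one variant per failure mode.
#[derive(Debug)]
enum Error {
    Connect(String),
    Transform(String),
}

impl fmt::Display for Error {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        match self {
            Error::Connect(url) => write!(f, "failed to connect to {url}"),
            Error::Transform(msg) => write!(f, "transform failed: {msg}"),
        }
    }
}

impl std::error::Error for Error {}

// The alias that lets signatures say `Result<Stats>` instead of
// `Result<Stats, Error>` everywhere.
type Result<T> = std::result::Result<T, Error>;

// A toy fallible function to show `?`-friendly signatures in action.
fn connect(url: &str) -> Result<()> {
    if url.starts_with("tcp://") {
        Ok(())
    } else {
        Err(Error::Connect(url.to_string()))
    }
}

fn main() {
    assert!(connect("tcp://localhost:9000").is_ok());
    let err = connect("bogus").unwrap_err();
    assert_eq!(err.to_string(), "failed to connect to bogus");
}
```

With the alias in place, `?` converts and propagates any `Error` up the call chain, which is what keeps the subcommand code free of `if err != nil`-style boilerplate.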

Cross-Compilation and Distribution

With cargo build --target, we produce binaries for Linux, macOS, and Windows from a single CI pipeline, using the musl target for fully static Linux builds. No cgo complications, no glibc version mismatches. The resulting binaries are self-contained and start instantly.
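Sketched as a CI fragment — the triples below are the standard Rust target names, and cross-building the non-native targets additionally requires matching linkers (or a wrapper tool such as cross):

```shell
# Register the target once per CI image.
rustup target add x86_64-unknown-linux-musl

# musl produces a fully static Linux binary with no glibc dependency.
cargo build --release --target x86_64-unknown-linux-musl

# The same pattern repeats for the other platforms' triples,
# e.g. aarch64-apple-darwin and x86_64-pc-windows-msvc.
```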

The Trade-Off

Rust has a steeper learning curve. The borrow checker requires a different way of thinking about data ownership, and compile times are longer than Go’s. But for tools that will be used thousands of times per day by engineers who care about performance, the investment pays off quickly.

We still use Go for our backend services — it remains the right tool for that job. But for CLI utilities where predictable performance and zero-cost abstractions matter, Rust has earned its place in our toolbox.