codeburn-rs: CodeBurn but 600x faster in Rust
A developer tool that meticulously tracks AI coding token usage across various models (e.g., Claude, Copilot, Codex). The core differentiator is its implementation in Rust, providing massive performance gains (up to 610x faster) over its JavaScript counterpart.
Name: codeburn-rs
Tagline: CodeBurn but 600x faster in Rust
Platform: web
Category: Developer Tools · DevOps
Source: github.com
The landscape of AI-assisted coding is rapidly expanding, but with increased usage comes a measurable cost: API tokens. `codeburn-rs` addresses this transparency gap by providing a dedicated tool to monitor and visualize AI coding token consumption. It’s a necessary utility for teams moving from experimental AI use to mission-critical development workflows, offering critical cost awareness that simply isn't visible in standard IDE/platform reporting.
What sets `codeburn-rs` apart is its engineering choice. By rewriting the functionality in Rust, the developers eliminated the performance bottlenecks that plagued earlier iterations of the tool. The published benchmarks are telling: cached sources complete in a mere 10.9ms, and cold runs show a speedup of approximately 101x over the JavaScript version. For any utility that processes potentially large volumes of token data or runs frequently in a CI/CD pipeline, this performance difference is not merely an improvement; it is a prerequisite for professional adoption.
Functionally, the tool is robust. It supports multiple major AI providers, including Claude, Codex, OpenAI, Pi, and Copilot. The command-line interface is well-designed, offering specific subcommands like `cburn today`, `cburn month`, and `cburn report`. The ability to filter reports by provider (e.g., `cburn report --provider claude`) or specify arbitrary date periods adds immense value, turning raw usage data into actionable financial and technical insights.
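The subcommands named above compose into a quick cost-review session. A minimal sketch using only the invocations mentioned in this review (exact output format not shown here):

```shell
# Token usage for today, across all configured providers
cburn today

# Aggregate usage for the current month
cburn month

# Full report, filtered to a single provider
cburn report --provider claude
```

The provider filter is what turns this from a curiosity into a budgeting tool: per-provider reports map directly onto per-provider API bills.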
For data analysts or team leads, the `cburn export --format csv` feature is invaluable, allowing the detailed consumption data to be plugged directly into BI tools or custom dashboards. Furthermore, the commitment to a native binary distribution via Homebrew or direct `cargo install` makes integration into developer tooling straightforward. It moves token usage tracking from a simple dashboard view into a reliable, CLI-driven component of the DevOps stack.
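The CSV export lends itself to quick downstream analysis. A minimal sketch in Python, assuming a hypothetical column layout (`date,provider,input_tokens,output_tokens`) since the actual export schema is not documented in this review:

```python
import csv
import io
from collections import defaultdict

# Hypothetical sample of `cburn export --format csv` output;
# the real column names and fields may differ.
SAMPLE = """date,provider,input_tokens,output_tokens
2024-06-01,claude,12000,3400
2024-06-01,copilot,8000,1500
2024-06-02,claude,9500,2100
"""

def totals_by_provider(csv_text: str) -> dict[str, int]:
    """Sum input + output tokens per provider from exported CSV text."""
    totals: dict[str, int] = defaultdict(int)
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["provider"]] += int(row["input_tokens"]) + int(row["output_tokens"])
    return dict(totals)

print(totals_by_provider(SAMPLE))
# e.g. {'claude': 27000, 'copilot': 9500}
```

The same dictionary of per-provider totals can be fed into a dashboard or joined against published per-token pricing to estimate spend.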
Article Tags
indie · developer tools · devops