“Hello, world won’t compile.”
With a title like that, the first issue opened in CCC’s GitHub repository didn’t go unnoticed. Neither did the project itself, and for good reason: Claude managed to create its own C compiler.
An engineer at Anthropic sparked the initiative. It took him about two weeks, roughly 2,000 Claude Code sessions, and nearly $20,000 in API costs to bring it to fruition, he explains. The result: about 100,000 lines of Rust code… and the ability to compile Linux 6.9 on x86-64, i686, AArch64, and RISC-V 64, with no dependencies.
GCC as Oracle and a Linker That Still Falls Short
Compilation itself proceeds without errors (which is notable), but assembly and linking, both essential stages of the toolchain, aren’t yet stable. Moreover, optimization levels still need to be implemented.
While human supervision was minimal (notably, no debugging guidance and no feedback on code quality), Claude was not entirely autonomous. Beyond the tests that kept it on track throughout the project, a synchronization algorithm prevented multiple agents from trying to solve the same problem at the same time.
Building CCC (Claude’s C Compiler) indeed relied on parallel instances of Claude Opus 4.6. The approach favored task specialization: one agent to merge duplicated code, a second to write the documentation, a third to analyze the project’s design from a Rust developer’s perspective, and so on.
The algorithm in question imposes locks on tasks by writing text files in a current_tasks/ folder. Merge conflicts are frequent, but Claude knows how to handle them, we’re told. In each session, each agent has its own Docker container with a local copy of the Git repo.
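The write-up doesn’t publish the locking code. A minimal sketch of the claim/release idea on a shared filesystem, with hypothetical function and task names; in the real project the current_tasks/ files travel through each agent’s copy of the Git repo, so collisions surface as merge conflicts rather than failed creates:

```python
import os

TASK_DIR = "current_tasks"  # folder named in the write-up; everything else here is assumed


def try_claim(task_name: str, agent_id: str) -> bool:
    """Attempt to claim a task by creating its lock file atomically.

    O_CREAT | O_EXCL fails if the file already exists, so only one
    agent can win the race for a given task.
    """
    os.makedirs(TASK_DIR, exist_ok=True)
    path = os.path.join(TASK_DIR, task_name + ".lock")
    try:
        fd = os.open(path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    except FileExistsError:
        return False  # another agent already holds this task
    with os.fdopen(fd, "w") as f:
        f.write(agent_id)  # record who holds the lock
    return True


def release(task_name: str) -> None:
    """Drop the lock so another agent can pick up the task."""
    os.remove(os.path.join(TASK_DIR, task_name + ".lock"))
```

The exclusive-create pattern keeps the mechanism simple enough for agents to reason about: the lock’s entire state is a plain text file they can list, read, and delete.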
This system worked well for compiling “smaller” open-source projects (SQLite, QuickJS, mbedTLS, libpng…), with each agent able to focus on one of them. With Linux, the agents eventually converged on the same task and began to step on one another’s toes. The GCC compiler was then used as the oracle. All of this happened without an orchestrator: each agent decided its own actions and documented any failures.
A less efficient compilation…
Claude Opus 4.5 was the first Anthropic LLM capable of producing a compiler that passes the reference test suites, notes the engineer. Claude Opus 4.6’s contribution is scale, applied to a project as large as the Linux kernel.
However, the generated code isn’t very efficient, he admits. Even with all possible optimizations, it doesn’t reach what GCC delivers out of the box.
A third-party comparison confirms this. Its author analyzed the Linux 6.9 build (x86-64) on the one hand and SQLite 3.46.0 on the other. His setup: two Debian VMs under Proxmox, each on its own physical node (6 vCPUs, 16 GB RAM, 100 GB NVMe).
With GCC 14.2.0, compiling SQLite takes 64.6 seconds; with CCC, 87 seconds.
Without optimization, GCC produces a 1.55 MB binary, versus 4.27 MB for CCC. The former uses up to 272 MB of RAM; the latter, 1,616 MB.
… and above all, a much slower execution
The gap is far more pronounced in execution time: 10.3 seconds with GCC without optimization… versus 2 h 6 min with CCC. This slowness isn’t uniform: it’s smaller for simple queries like dropping tables or adding rows, and much larger for operations that involve nested loops.
This difference can be explained, among other things, by poor CPU register allocation: CCC spills variables to the stack. The size of the generated code also matters, since it aggravates instruction cache misses (the CPU can’t keep everything in L1/L2). Moreover, the production of corrupted pointers and the absence of symbol table generation make profiling and debugging impossible.
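An illustrative cost model (not from the benchmark) shows why spilling hits nested loops hardest: every extra stack access in the loop body is multiplied by the full iteration count, so the overhead compounds with nesting depth.

```python
def extra_memory_ops(loop_depths, accesses_per_iteration, spill_cost=2):
    """Estimate the extra loads/stores incurred when a variable lives on
    the stack instead of in a register.

    loop_depths: iteration counts of the nested loops, e.g. [1000, 1000]
    accesses_per_iteration: accesses to the spilled variable in the body
    spill_cost: memory ops per access (a load, plus a store on writes)
    """
    total_iterations = 1
    for n in loop_depths:
        total_iterations *= n
    return total_iterations * accesses_per_iteration * spill_cost


# A single loop of 1,000 iterations wastes about 6,000 memory ops,
# but nesting two such loops wastes about 6,000,000.
```

The numbers are arbitrary; the point is the multiplicative structure, which matches the benchmark’s observation that nested-loop workloads suffer most.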
As for the kernel, CCC compiles all source files without error, but fails at the linker stage. In particular, it generates incorrect relocation entries for jump labels.