ARC-AGI without pretraining
Published on: 2025-07-04 14:52:38
By Isaac Liao and Albert Gu
In this blog post, we aim to answer a simple yet fundamental question:
Can lossless information compression by itself produce intelligent behavior?
The idea that efficient compression lies at the heart of intelligence is not new (see, e.g., Hernández-Orallo & Minaya-Collado, 1998; Mahoney, 1999; Hutter, 2005; Legg & Hutter, 2007). Rather than revisiting those theoretical discussions, we offer a practical demonstration.
In this work, we give evidence that lossless compression during inference time is sufficient to produce intelligent behavior, by developing a method purely based on compression that performs well on the ARC-AGI challenge, a dataset of IQ-test-like puzzles about inferring a procedure/rule from limited demonstrations. Crucially, our solution, which we name CompressARC, obeys the following three restrictions:
No pretraining; models are randomly initialized and trained during inference time.
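To make the "no pretraining" restriction concrete, here is a minimal toy sketch (not CompressARC itself, whose architecture and objective are not shown here): a randomly initialized model is fit at inference time using only a single puzzle's few demonstration pairs, then applied to the held-out test input. The hidden "rule" in this toy is just a fixed linear map the model must infer from the demonstrations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Demonstration pairs: inputs x and outputs y = x @ R, where R is the
# hidden rule to be inferred (here: swap the two coordinates).
R = np.array([[0.0, 1.0],
              [1.0, 0.0]])
demos_x = np.array([[1.0, 0.0],
                    [0.0, 1.0],
                    [1.0, 1.0],
                    [1.0, -1.0]])
demos_y = demos_x @ R

# Randomly initialized parameters -- no pretraining, no external data.
W = rng.normal(size=(2, 2))

# Inference-time training: plain gradient descent on the demos only.
lr = 0.1
for _ in range(500):
    grad = demos_x.T @ (demos_x @ W - demos_y) / len(demos_x)
    W -= lr * grad

# Apply the inferred rule to an unseen test input.
test_x = np.array([[3.0, -2.0]])
pred = test_x @ W
print(np.round(pred, 3))  # prints [[-2.  3.]]
```

The point of the sketch is only the protocol: all learning happens at inference time, from the demonstrations of the one puzzle being solved.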