Local LLMs versus offline Wikipedia
Two days ago, MIT Technology Review published “How to run an LLM on your laptop”. It opens with an anecdote about using offline LLMs in an apocalypse scenario: “‘It’s like having a weird, condensed, faulty version of Wikipedia, so I can help reboot society with the help of my little USB stick,’ [Simon Willison] says.” This made me wonder: how do the sizes of local LLMs compare to the sizes of offline Wikipedia downloads? I compared some models from the Ollama library to various downloads on Kiwix.