Loading Pydantic models from JSON without running out of memory

Published on: 2025-06-26 08:06:37

You have a large JSON file, and you want to load the data into Pydantic. Unfortunately, this uses a lot of memory, to the point where large JSON files are very difficult to read. What to do? Assuming you're stuck with JSON, in this article we'll cover:

- The high memory usage you get with Pydantic's default JSON loading.
- How to reduce memory usage by switching to another JSON library.
- Going further by switching to dataclasses with slots.

The problem: 20× memory multiplier

We're going to start with a 100MB JSON file, and load it into Pydantic (v2.11.4). Here's what our model looks like:

```python
from pydantic import BaseModel, RootModel

class Name(BaseModel):
    first: str | None
    last: str | None

class Customer(BaseModel):
    id: str
    name: Name
    notes: str

# Map id to corresponding Customer:
CustomerDirectory = RootModel[dict[str, Customer]]
```

The JSON we're loading looks more or less like this:

```json
{
    "123": {
        "id": "123",
        "name": {
            "first": "Itamar",
            "last": "T…
```